# STDOUT: ---v---v---v---v---v---
ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/jenkins/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /opt/ansible-2.9/lib/python3.9/site-packages/ansible
  executable location = /opt/ansible-2.9/bin/ansible-playbook
  python version = 3.9.18 (main, Sep 7 2023, 00:00:00) [GCC 11.4.1 20230605 (Red Hat 11.4.1-2)]
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_lvm_percent_size.yml *******************************************
1 plays in /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml

PLAY [Test specifying size as a percentage] ************************************

TASK [Gathering Facts] *********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:2
Sunday 28 January 2024  05:59:34 +0000 (0:00:00.012)       0:00:00.012 ********
ok: [sut]
META: ran handlers

TASK [Run the role] ************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:17
Sunday 28 January 2024  05:59:36 +0000 (0:00:01.774)       0:00:01.786 ********

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.019)       0:00:01.805 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.024)       0:00:01.830 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.022)       0:00:01.852 ********
skipping: [sut] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [sut] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [sut] => (item=Fedora_38.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_38.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [sut] => (item=Fedora_38.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_38.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Check if system is ostree] ******************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.033)       0:00:01.886 ********
ok: [sut] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.261)       0:00:02.147 ********
ok: [sut] => {
    "ansible_facts": {
        "__storage_is_ostree": false
    },
    "changed": false
}

TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.021)       0:00:02.169 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.008)       0:00:02.178 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Include the appropriate provider tasks] *****
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.008)       0:00:02.187 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut

TASK [linux-system-roles.storage : Make sure blivet is available] **************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Sunday 28 January 2024  05:59:36 +0000 (0:00:00.029)       0:00:02.216 ********
changed: [sut] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: python3-bytesize-2.10-1.fc38.x86_64",
        "Installed: lsof-4.96.3-3.fc38.x86_64",
        "Installed: python3-blivet-1:3.7.1-1.fc38.noarch",
        "Installed: python3-blockdev-2.28-5.fc38.x86_64",
        "Installed: libblockdev-btrfs-2.28-5.fc38.x86_64",
        "Installed: python3-pyparted-1:3.12.0-8.fc38.x86_64",
        "Installed: blivet-data-1:3.7.1-1.fc38.noarch",
        "Installed: device-mapper-multipath-0.9.4-2.fc38.x86_64",
        "Installed: libblockdev-mpath-2.28-5.fc38.x86_64",
        "Installed: device-mapper-multipath-libs-0.9.4-2.fc38.x86_64",
        "Installed: libblockdev-dm-2.28-5.fc38.x86_64",
        "Installed: lzo-2.10-8.fc38.x86_64",
        "Installed: btrfs-progs-6.6.2-1.fc38.x86_64",
        "Installed: libblockdev-lvm-2.28-5.fc38.x86_64"
    ]
}
lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet

TASK [linux-system-roles.storage : Show storage_pools] *************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11
Sunday 28 January 2024  05:59:43 +0000 (0:00:06.774)       0:00:08.991 ********
ok: [sut] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : Show storage_volumes] ***********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16
Sunday 28 January 2024  05:59:43 +0000 (0:00:00.019)       0:00:09.010 ********
ok: [sut] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : Get required packages] **********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21
Sunday 28 January 2024  05:59:43 +0000 (0:00:00.017)       0:00:09.028 ********
ok: [sut] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Enable copr repositories if needed] *********
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34
Sunday 28 January 2024  05:59:44 +0000 (0:00:00.749)       0:00:09.777 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut

TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Sunday 28 January 2024  05:59:44 +0000 (0:00:00.035)       0:00:09.813 ********
skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => {
    "ansible_loop_var": "repo",
    "changed": false,
    "repo": {
        "packages": [
            "vdo",
            "kmod-vdo"
        ],
        "repository": "rhawalsh/dm-vdo"
    },
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Make sure COPR support packages are present] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Sunday 28 January 2024  05:59:44 +0000 (0:00:00.015)       0:00:09.829 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage :
Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 05:59:44 +0000 (0:00:00.009) 0:00:09.839 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 05:59:44 +0000 (0:00:00.014) 0:00:09.853 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 05:59:47 +0000 (0:00:02.697) 0:00:12.551 ******** ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", 
"source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": 
"dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": 
"dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { 
"name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "inactive", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": 
"systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": 
"systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 05:59:49 +0000 (0:00:02.319) 0:00:14.871 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 05:59:49 +0000 (0:00:00.019) 0:00:14.890 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 05:59:49 +0000 (0:00:00.011) 0:00:14.901 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 28 January 
2024 05:59:50 +0000 (0:00:00.367) 0:00:15.269 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.012) 0:00:15.282 ******** TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.009) 0:00:15.292 ******** ok: [sut] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.015) 0:00:15.307 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:129 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.022) 0:00:15.330 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:145 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.020) 0:00:15.350 ******** TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:157 Sunday 28 
January 2024 05:59:50 +0000 (0:00:00.027) 0:00:15.378 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:162 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.020) 0:00:15.398 ******** TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.019) 0:00:15.417 ******** TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:189 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.012) 0:00:15.429 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:197 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.011) 0:00:15.441 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1705410356.493937, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1705410883.285937, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1705410356.493937, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, 
"size": 0, "uid": 0, "version": "3072151005", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:202 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.206) 0:00:15.647 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:224 Sunday 28 January 2024 05:59:50 +0000 (0:00:00.012) 0:00:15.660 ******** ok: [sut] TASK [Get unused disks] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:21 Sunday 28 January 2024 05:59:51 +0000 (0:00:00.773) 0:00:16.434 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/get_unused_disk.yml for sut TASK [Ensure test packages] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/get_unused_disk.yml:2 Sunday 28 January 2024 05:59:51 +0000 (0:00:00.019) 0:00:16.454 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: util-linux-core TASK [Find unused disks in the system] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/get_unused_disk.yml:16 Sunday 28 January 2024 05:59:53 +0000 (0:00:02.433) 0:00:18.887 ******** ok: [sut] => { "changed": false, "disks": [ "sda" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/get_unused_disk.yml:24 Sunday 28 January 2024 05:59:53 +0000 (0:00:00.279) 0:00:19.167 ******** ok: [sut] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] 
******** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/get_unused_disk.yml:29 Sunday 28 January 2024 05:59:53 +0000 (0:00:00.015) 0:00:19.182 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/get_unused_disk.yml:34 Sunday 28 January 2024 05:59:53 +0000 (0:00:00.012) 0:00:19.195 ******** ok: [sut] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of invalid percentage-based size specification.] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:27 Sunday 28 January 2024 05:59:53 +0000 (0:00:00.013) 0:00:19.208 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-failed.yml for sut TASK [Store global variable value copy] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-failed.yml:4 Sunday 28 January 2024 05:59:53 +0000 (0:00:00.020) 0:00:19.229 ******** ok: [sut] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": false, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-failed.yml:10 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.016) 0:00:19.246 ******** TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.037) 0:00:19.283 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 28 January 2024 05:59:54 
+0000 (0:00:00.018) 0:00:19.301 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.013) 0:00:19.315 ******** skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.031) 0:00:19.346 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 28 January 2024 05:59:54 +0000 
(0:00:00.011) 0:00:19.357 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.010) 0:00:19.368 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.010) 0:00:19.378 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.010) 0:00:19.389 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 28 January 2024 05:59:54 +0000 (0:00:00.026) 0:00:19.415 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 28 January 2024 05:59:56 +0000 (0:00:02.392) 0:00:21.807 ******** ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": 
"/opt/test1", "name": "test1", "size": "2x%" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 28 January 2024 05:59:56 +0000 (0:00:00.015) 0:00:21.823 ******** ok: [sut] => { "storage_volumes": [] } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 28 January 2024 05:59:56 +0000 (0:00:00.014) 0:00:21.837 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 28 January 2024 05:59:58 +0000 (0:00:01.718) 0:00:23.555 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 28 January 2024 05:59:58 +0000 (0:00:00.021) 0:00:23.577 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 28 January 2024 05:59:58 +0000 (0:00:00.017) 0:00:23.595 ******** skipping: [sut] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 05:59:58 +0000 (0:00:00.010) 0:00:23.605 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 05:59:58 +0000 (0:00:00.019) 0:00:23.625 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 06:00:00 +0000 (0:00:02.437) 0:00:26.062 ******** ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" 
}, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": 
"modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": 
{ "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", 
"status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": 
"systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": 
"systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 06:00:03 +0000 (0:00:02.339) 0:00:28.401 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 06:00:03 +0000 (0:00:00.036) 0:00:28.438 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 06:00:03 +0000 (0:00:00.011) 0:00:28.449 ******** fatal: [sut]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: invalid percentage '2x%' size specified in pool 'foo' TASK [linux-system-roles.storage : Failed message] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:106 Sunday 28 January 2024 06:00:04 +0000 (0:00:01.615) 0:00:30.064 ******** fatal: [sut]: FAILED! => { "changed": false } MSG: {'msg': "invalid percentage '2x%' size specified in pool 'foo'", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': None, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': None, 'fs_label': None, 'fs_type': None, 'mount_options': None, 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '2x%', 'state': 'present', 'type': None, 'cached': None, 'cache_devices': [], 'cache_mode': None, 'cache_size': None, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.014) 0:00:30.079 ******** TASK [Check that we failed in the role] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-failed.yml:29 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.009) 0:00:30.089 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-failed.yml:34 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.012) 0:00:30.102 ******** ok: [sut] => { "changed": false } MSG: All 
assertions passed TASK [Verify correct exception or error message] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-failed.yml:45 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.017) 0:00:30.120 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create two LVM logical volumes under volume group 'foo' using percentage sizes] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:44 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.010) 0:00:30.131 ******** TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.022) 0:00:30.153 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.017) 0:00:30.171 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.013) 0:00:30.184 ******** skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", 
"libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.029) 0:00:30.213 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.010) 0:00:30.224 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 28 January 2024 06:00:04 +0000 (0:00:00.011) 0:00:30.236 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 28 January 2024 06:00:05 +0000 (0:00:00.010) 0:00:30.247 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage 
: Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 28 January 2024 06:00:05 +0000 (0:00:00.010) 0:00:30.258 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 28 January 2024 06:00:05 +0000 (0:00:00.022) 0:00:30.280 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 28 January 2024 06:00:07 +0000 (0:00:02.601) 0:00:32.881 ******** ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "fs_type": "ext4", "mount_point": "/opt/test2", "name": "test2", "size": "40%" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 28 January 2024 06:00:07 +0000 (0:00:00.016) 0:00:32.898 ******** ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 28 January 2024 06:00:07 +0000 (0:00:00.014) 0:00:32.912 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ 
"e2fsprogs", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 28 January 2024 06:00:09 +0000 (0:00:01.565) 0:00:34.478 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 28 January 2024 06:00:09 +0000 (0:00:00.021) 0:00:34.499 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 28 January 2024 06:00:09 +0000 (0:00:00.018) 0:00:34.517 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 06:00:09 +0000 (0:00:00.010) 0:00:34.528 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 06:00:09 +0000 (0:00:00.018) 0:00:34.547 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: e2fsprogs kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 06:00:11 +0000 (0:00:02.452) 0:00:36.999 ******** ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { 
"name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": 
{ "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": 
"mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": 
"plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 06:00:13 +0000 (0:00:02.227) 0:00:39.227 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 06:00:14 +0000 (0:00:00.019) 0:00:39.246 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 06:00:14 +0000 (0:00:00.010) 0:00:39.256 ******** changed: [sut] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", 
"state": "mounted" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": 
false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 28 January 2024 06:00:16 +0000 (0:00:02.985) 0:00:42.242 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 06:00:17 +0000 (0:00:00.012) 0:00:42.254 ******** TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 28 January 2024 06:00:17 +0000 (0:00:00.010) 0:00:42.264 ******** ok: [sut] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test2", "fs_type": 
null }, { "action": "create format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/mapper/foo-test1", "/dev/mapper/foo-test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, 
"mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 28 January 2024 06:00:17 +0000 (0:00:00.014) 0:00:42.279 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": 
null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:129 Sunday 28 January 2024 06:00:17 +0000 (0:00:00.014) 0:00:42.294 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:145 Sunday 28 January 2024 06:00:17 +0000 (0:00:00.012) 0:00:42.306 ******** TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:157 Sunday 28 January 2024 06:00:17 +0000 (0:00:00.010) 0:00:42.317 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:162 Sunday 28 January 2024 06:00:18 +0000 (0:00:01.129) 0:00:43.446 ******** changed: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", 
"mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Sunday 28 January 2024 06:00:19 +0000 (0:00:01.022) 0:00:44.468 ******** skipping: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, 
"opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:189 Sunday 28 January 2024 06:00:19 +0000 (0:00:00.019) 0:00:44.488 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:197 Sunday 28 January 2024 06:00:20 +0000 (0:00:00.992) 0:00:45.480 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421590.370748, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1705410883.285937, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1705410356.493937, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3072151005", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:202 Sunday 28 January 2024 06:00:20 +0000 (0:00:00.211) 0:00:45.692 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:224 Sunday 28 January 2024 06:00:20 +0000 (0:00:00.011) 0:00:45.704 ******** ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:62 Sunday 28 January 2024 06:00:21 +0000 (0:00:00.794) 0:00:46.498 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:2 Sunday 28 January 2024 06:00:21 +0000 (0:00:00.023) 0:00:46.521 ******** ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:7 Sunday 28 January 2024 06:00:21 +0000 (0:00:00.018) 0:00:46.539 ******** skipping: [sut] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:15 Sunday 28 January 2024 06:00:21 +0000 (0:00:00.030) 0:00:46.569 ******** ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", 
"name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:20 Sunday 28 January 2024 06:00:21 +0000 (0:00:00.256) 0:00:46.825 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003401", "end": "2024-01-28 06:00:21.810132", "rc": 0, "start": "2024-01-28 06:00:21.806731" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jan 16 13:05:56 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=9db35ec2-66ac-4531-8ad6-ffb8154c9c87 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:25 Sunday 28 January 2024 06:00:21 +0000 (0:00:00.250) 0:00:47.076 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003395", "end": "2024-01-28 06:00:22.014391", "failed_when_result": false, "rc": 0, "start": "2024-01-28 06:00:22.010996" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:34 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.204) 0:00:47.280 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:5 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.030) 0:00:47.311 ******** ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:18 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:47.323 ******** ok: [sut] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.033626", "end": "2024-01-28 06:00:22.293314", "rc": 0, "start": "2024-01-28 06:00:22.259688" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:24 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.236) 0:00:47.560 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:34 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.023) 0:00:47.584 ******** included: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:2 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.027) 0:00:47.611 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:13 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.018) 0:00:47.629 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:22 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.271) 0:00:47.901 ******** ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:27 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.015) 0:00:47.917 ******** ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:33 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.016) 0:00:47.933 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:42 Sunday 28 
January 2024 06:00:22 +0000 (0:00:00.015) 0:00:47.948 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:48 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.013) 0:00:47.962 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:54 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.014) 0:00:47.976 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:59 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.010) 0:00:47.987 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:73 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.018) 0:00:48.006 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:8 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.022) 0:00:48.028 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:14 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 
0:00:48.039 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:21 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.010) 0:00:48.050 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:28 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.062 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:35 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.073 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:45 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.010) 0:00:48.084 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:54 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.095 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:64 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.010) 0:00:48.106 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata 
version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:74 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.010) 0:00:48.117 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:85 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.129 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:95 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.140 ******** ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:76 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.151 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml:2 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.023) 0:00:48.174 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8 Sunday 28 January 2024 06:00:22 +0000 (0:00:00.030) 0:00:48.205 
********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set LV segment type] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16
Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.216 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check segment type] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21
Sunday 28 January 2024 06:00:22 +0000 (0:00:00.010) 0:00:48.227 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set LV stripe size] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29
Sunday 28 January 2024 06:00:22 +0000 (0:00:00.011) 0:00:48.239 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the requested stripe size] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.026) 0:00:48.265 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set expected stripe size] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:48.278 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check stripe size] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.289 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.300 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set LV segment type] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.312 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check segment type] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.323 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set LV stripe size] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:48.335 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the requested stripe size] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.346 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set expected stripe size] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.357 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check stripe size] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.368 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:79
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.379 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml for sut

TASK [Validate pool member thinpool settings] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml:2
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.024) 0:00:48.404 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut

TASK [Get information about thinpool] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.028) 0:00:48.433 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.013) 0:00:48.446 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.457 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reset variable used by test] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.468 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_thin_status": null
    },
    "changed": false
}

TASK [Get information about thinpool] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.010) 0:00:48.479 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.010) 0:00:48.490 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.010) 0:00:48.501 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reset variable used by test] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:48.513 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_thin_status": null
    },
    "changed": false
}

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:82
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.524 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:5
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.025) 0:00:48.549 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:13
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.013) 0:00:48.563 ********
skipping: [sut] => (item=/dev/sda) => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:20
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.014) 0:00:48.577 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml for sut

TASK [Set variables used by tests] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:2
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.021) 0:00:48.599 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:9
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.013) 0:00:48.612 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:18
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.014) 0:00:48.626 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:27
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.010) 0:00:48.637 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:37
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.649 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:47
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.660 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:27
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.672 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:85
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.683 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml for sut

TASK [Validate pool member VDO settings] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml:2
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.026) 0:00:48.709 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut

TASK [Get information about VDO deduplication] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.030) 0:00:48.739 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:48.752 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.763 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about VDO compression] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.774 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.786 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.797 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.010) 0:00:48.808 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [Get information about VDO deduplication] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.819 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:48.831 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.842 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about VDO compression] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.853 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.864 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.875 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.010) 0:00:48.886 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [Clean up test variables] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:88
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:48.898 ********
ok: [sut] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [Verify the volumes] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml:3
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:48.909 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut

TASK [Set storage volume test variables] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.026) 0:00:48.936 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.014) 0:00:48.951 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut

TASK [Get expected mount device based on device type] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.088) 0:00:49.040 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.015) 0:00:49.055 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 1536347,
                "block_size": 4096,
                "block_total": 1555456,
                "block_used": 19109,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 3143677,
                "inode_total": 3143680,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 6292877312,
                "size_total": 6371147776,
                "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 1536347,
                "block_size": 4096,
                "block_total": 1555456,
                "block_used": 19109,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 3143677,
                "inode_total": 3143680,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 6292877312,
                "size_total": 6371147776,
                "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Get information about the mountpoint directory] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.018) 0:00:49.074 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by device] ********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:49.086 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.014) 0:00:49.101 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.013) 0:00:49.114 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify mount directory group] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:49.126 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify mount directory permissions] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:49.137 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the mount fs type] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:49.148 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get path of test volume device] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.014) 0:00:49.163 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:49.174 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:49.185 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.012) 0:00:49.197 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_found_mount_stat": null,
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.011) 0:00:49.209 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40
Sunday 28 January 2024 06:00:23 +0000 (0:00:00.024) 0:00:49.233 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.014) 0:00:49.247 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.014) 0:00:49.262 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.012) 0:00:49.274 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.012) 0:00:49.287 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.015) 0:00:49.302 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.016) 0:00:49.319 ********
ok: [sut] => {
    "changed": false,
    "stat": {
        "atime": 1706421619.1827211,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1706421616.8768034,
        "dev": 5,
        "device_type": 64769,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 817,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1706421616.8768034,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.205) 0:00:49.525 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.015) 0:00:49.540 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about this volume] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.012) 0:00:49.553 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.013) 0:00:49.566 ********
ok: [sut] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.012) 0:00:49.578 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.012) 0:00:49.590 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.013) 0:00:49.604 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10
Sunday 28 January 2024 06:00:24 +0000 (0:00:00.011) 0:00:49.615 ********
changed: [sut] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: cryptsetup-2.6.1-1.fc38.x86_64"
    ]
}

lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17
Sunday 28 January 2024 06:00:27 +0000 (0:00:03.476) 0:00:53.092 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.012) 0:00:53.105 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.011) 0:00:53.116 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.016) 0:00:53.133 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.011) 0:00:53.144 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.011) 0:00:53.155 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.011) 0:00:53.166 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.012) 0:00:53.179 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.011) 0:00:53.190 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.015) 0:00:53.206 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.013) 0:00:53.219 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.011) 0:00:53.230 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131
Sunday 28 January 2024 06:00:27 +0000 (0:00:00.011) 0:00:53.242 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:53.254 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.265 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.276 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.287 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.299 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set chunk size regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.310 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the chunk size] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:53.322 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:53.334 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.346 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.357 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID chunk size] ***************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.368 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the actual size of the volume] *************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.379 ********
ok: [sut] => {
    "bytes": 6442450944,
    "changed": false,
    "lvm": "6g",
    "parted": "6GiB",
    "size": "6 GiB"
}

TASK [Parse the requested size of the volume] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.272) 0:00:53.651 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.015) 0:00:53.666 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show expected size] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.013) 0:00:53.680 ********
ok: [sut] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [Get the size of parent/pool device] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:53.692 ********
ok: [sut] => {
    "bytes": 10726680821,
    "changed": false,
    "lvm": "9g",
    "parted": "9GiB",
    "size": "9 GiB"
}

TASK [Show test pool] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46
Sunday 28 January 2024 06:00:28 +0000 (0:00:00.200) 0:00:53.893 ********
ok: [sut] => {
    "storage_test_pool": {
        "disks": [
            "sda"
        ],
        "encryption": false,
        "encryption_cipher": null,
        "encryption_key": null,
        "encryption_key_size": null,
        "encryption_luks_version": null,
        "encryption_password": null,
        "name": "foo",
        "raid_chunk_size": null,
        "raid_device_count": null,
        "raid_level": null,
        "raid_metadata_version": null,
        "raid_spare_count": null,
        "shared": false,
        "state": "present",
        "type": "lvm",
        "volumes": [
            {
                "_device": "/dev/mapper/foo-test1",
                "_kernel_device": "/dev/dm-1",
                "_mount_id": "/dev/mapper/foo-test1",
                "_raw_device": "/dev/mapper/foo-test1",
                "_raw_kernel_device": "/dev/dm-1",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_group": null,
                "mount_mode": null,
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test1",
                "mount_user": null,
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_disks": [],
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "raid_stripe_size": null,
                "size": "60%",
                "state": "present",
                "thin": false,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "lvm",
                "vdo_pool_size": null
            },
            {
                "_device": "/dev/mapper/foo-test2",
                "_kernel_device": "/dev/dm-0",
                "_mount_id": "/dev/mapper/foo-test2",
                "_raw_device":
"/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } } TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.018) 0:00:53.911 ******** ok: [sut] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, 
"/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } } TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.017) 0:00:53.928 ******** ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.015) 0:00:53.944 ******** ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "6436008492.599999" }, "changed": false } TASK [Default thin pool reserved space values] ********************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.016) 0:00:53.960 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:53.973 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:53.985 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:53.997 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:54.008 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:54.019 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 
January 2024 06:00:28 +0000 (0:00:00.011) 0:00:54.030 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.015) 0:00:54.045 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:54.058 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:54.071 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:54.084 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:54.095 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.011) 0:00:54.107 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.013) 0:00:54.120 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.027) 0:00:54.148 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:54.161 ******** ok: [sut] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.013) 0:00:54.174 ******** ok: [sut] => { "storage_test_expected_size": "6436008492.599999" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.012) 0:00:54.187 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:00:28 +0000 (0:00:00.016) 0:00:54.203 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036909", "end": 
"2024-01-28 06:00:29.173027", "rc": 0, "start": "2024-01-28 06:00:29.136118" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.237) 0:00:54.441 ******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.015) 0:00:54.457 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.015) 0:00:54.473 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.485 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.498 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 
0:00:54.510 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.522 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.011) 0:00:54.534 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.014) 0:00:54.548 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.057) 0:00:54.605 ******** ok: [sut] => { 
"ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.015) 0:00:54.620 ******** ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 3908145152, "size_total": 4139483136, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 3908145152, "size_total": 4139483136, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.018) 0:00:54.638 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.651 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount 
point] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.015) 0:00:54.666 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.013) 0:00:54.680 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.011) 0:00:54.691 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.703 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.716 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.016) 0:00:54.732 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128 Sunday 
28 January 2024 06:00:29 +0000 (0:00:00.013) 0:00:54.745 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.013) 0:00:54.759 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.772 ******** ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.011) 0:00:54.783 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.026) 0:00:54.810 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK 
[Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.016) 0:00:54.826 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.016) 0:00:54.842 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:54.855 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.011) 0:00:54.866 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.017) 0:00:54.883 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.018) 
0:00:54.902 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421619.2167199, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1706421616.4598181, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 776, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1706421616.4598181, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.206) 0:00:55.109 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.016) 0:00:55.125 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:55.138 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.015) 0:00:55.154 ******** ok: [sut] => { "ansible_facts": { 
"st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.014) 0:00:55.168 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.012) 0:00:55.180 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.016) 0:00:55.196 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 Sunday 28 January 2024 06:00:29 +0000 (0:00:00.011) 0:00:55.208 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17 Sunday 28 January 2024 06:00:32 +0000 (0:00:02.437) 0:00:57.646 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.014) 0:00:57.660 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that 
the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.013) 0:00:57.673 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.058) 0:00:57.732 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.012) 0:00:57.745 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.012) 0:00:57.757 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.769 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.780 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task 
path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.012) 0:00:57.792 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.015) 0:00:57.808 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.014) 0:00:57.823 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.834 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.845 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.856 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.012) 0:00:57.869 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.012) 0:00:57.881 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.892 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.903 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.915 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.926 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID 
active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.012) 0:00:57.938 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.949 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.961 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.972 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.011) 0:00:57.983 ******** ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.201) 0:00:58.185 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task 
path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.014) 0:00:58.200 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.013) 0:00:58.213 ******** ok: [sut] => { "storage_test_expected_size": "6436008492.599999" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32 Sunday 28 January 2024 06:00:32 +0000 (0:00:00.012) 0:00:58.226 ******** ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.200) 0:00:58.427 ******** ok: [sut] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } } TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.018) 0:00:58.445 ******** ok: [sut] => { 
"storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } } TASK [Show test pool size] ***************************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.018) 0:00:58.464 ******** ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.017) 0:00:58.481 ******** ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "4290672328.4" }, "changed": false } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.017) 0:00:58.498 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.511 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.524 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.536 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.014) 0:00:58.550 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.563 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.576 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.011) 0:00:58.587 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.600 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.612 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.013) 0:00:58.626 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based 
on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.014) 0:00:58.641 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.014) 0:00:58.655 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.019) 0:00:58.675 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.015) 0:00:58.691 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:58.703 ******** ok: [sut] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.015) 0:00:58.719 ******** ok: [sut] => { "storage_test_expected_size": "4290672328.4" } TASK [Assert expected size is actual size] 
************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.013) 0:00:58.733 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.018) 0:00:58.751 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.033300", "end": "2024-01-28 06:00:33.718614", "rc": 0, "start": "2024-01-28 06:00:33.685314" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.236) 0:00:58.988 ******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.016) 0:00:59.004 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.017) 0:00:59.021 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.013) 0:00:59.035 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:59.047 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.014) 0:00:59.062 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.030) 0:00:59.092 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:44 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.011) 0:00:59.104 ******** TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:54 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.010) 0:00:59.114 ******** ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:65 Sunday 28 January 2024 06:00:33 +0000 
(0:00:00.011) 0:00:59.125 ******** TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.027) 0:00:59.152 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.016) 0:00:59.169 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.013) 0:00:59.182 ******** skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", 
"skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.031) 0:00:59.214 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.011) 0:00:59.225 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 28 January 2024 06:00:33 +0000 (0:00:00.012) 0:00:59.238 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 28 January 2024 06:00:34 +0000 (0:00:00.011) 0:00:59.249 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 28 January 2024 06:00:34 +0000 (0:00:00.011) 0:00:59.261 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 28 January 
2024 06:00:34 +0000 (0:00:00.025) 0:00:59.287 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 28 January 2024 06:00:36 +0000 (0:00:02.371) 0:01:01.659 ******** ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "mount_point": "/opt/test2", "name": "test2", "size": "40%" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 28 January 2024 06:00:36 +0000 (0:00:00.018) 0:01:01.677 ******** ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 28 January 2024 06:00:36 +0000 (0:00:00.014) 0:01:01.692 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 28 January 2024 06:00:38 +0000 (0:00:02.227) 0:01:03.920 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 28 January 2024 06:00:38 +0000 (0:00:00.025) 0:01:03.945 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 28 January 2024 06:00:38 +0000 (0:00:00.020) 0:01:03.966 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 06:00:38 +0000 (0:00:00.013) 0:01:03.979 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 06:00:38 +0000 (0:00:00.019) 0:01:03.999 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 06:00:41 +0000 (0:00:02.400) 0:01:06.399 ******** ok: [sut] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": 
"dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": 
"nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": 
"running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": 
{ "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { 
"name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": 
"running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 06:00:43 +0000 (0:00:02.213) 0:01:08.613 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] 
******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 06:00:43 +0000 (0:00:00.022) 0:01:08.635 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 06:00:43 +0000 (0:00:00.041) 0:01:08.676 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 28 January 2024 
06:00:45 +0000 (0:00:02.493) 0:01:11.170 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 06:00:45 +0000 (0:00:00.012) 0:01:11.183 ******** TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 28 January 2024 06:00:45 +0000 (0:00:00.010) 0:01:11.194 ******** ok: [sut] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], 
"volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 28 January 2024 06:00:45 +0000 (0:00:00.015) 0:01:11.209 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": 
"/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:129 Sunday 28 January 2024 06:00:45 +0000 (0:00:00.015) 0:01:11.224 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:145 Sunday 28 January 2024 06:00:45 +0000 (0:00:00.014) 0:01:11.239 ******** TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:157 Sunday 28 January 2024 06:00:46 +0000 (0:00:00.011) 0:01:11.250 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:162 Sunday 28 January 2024 06:00:47 +0000 (0:00:01.001) 0:01:12.251 ******** ok: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } ok: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Sunday 28 January 2024 06:00:47 +0000 (0:00:00.422) 0:01:12.674 ******** skipping: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, 
"passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:189 Sunday 28 January 2024 06:00:47 +0000 (0:00:00.027) 0:01:12.701 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:197 Sunday 28 January 2024 06:00:48 +0000 (0:00:00.975) 0:01:13.676 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421590.370748, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1705410883.285937, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1705410356.493937, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3072151005", "wgrp": 
false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:202 Sunday 28 January 2024 06:00:48 +0000 (0:00:00.203) 0:01:13.879 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:224 Sunday 28 January 2024 06:00:48 +0000 (0:00:00.011) 0:01:13.891 ******** ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:80 Sunday 28 January 2024 06:00:49 +0000 (0:00:00.794) 0:01:14.686 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:2 Sunday 28 January 2024 06:00:49 +0000 (0:00:00.021) 0:01:14.708 ******** ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:7 Sunday 28 January 
2024 06:00:49 +0000 (0:00:00.015) 0:01:14.724 ********
skipping: [sut] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:15
Sunday 28 January 2024 06:00:49 +0000 (0:00:00.011) 0:01:14.735 ********
ok: [sut] => { "changed": false, "info": {
"/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" },
"/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" },
"/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" },
"/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
"/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
"/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" },
"/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" },
"/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" },
"/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" },
"/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" },
"/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" },
"/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" },
"/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" },
"/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G",
"type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:20 Sunday 28 January 2024 06:00:49 +0000 (0:00:00.207) 0:01:14.943 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003550", "end": "2024-01-28 06:00:49.878174", "rc": 0, "start": "2024-01-28 06:00:49.874624" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jan 16 13:05:56 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=9db35ec2-66ac-4531-8ad6-ffb8154c9c87 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read 
the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:25 Sunday 28 January 2024 06:00:49 +0000 (0:00:00.219) 0:01:15.162 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003566", "end": "2024-01-28 06:00:50.098345", "failed_when_result": false, "rc": 0, "start": "2024-01-28 06:00:50.094779" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:34 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.204) 0:01:15.367 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:5 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.031) 0:01:15.398 ******** ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:18 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:15.410 ******** ok: [sut] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.037287", "end": "2024-01-28 06:00:50.382394", "rc": 0, "start": "2024-01-28 06:00:50.345107" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:24 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.240) 0:01:15.650 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:34 Sunday 28 January 2024 06:00:50 +0000 
(0:00:00.018) 0:01:15.669 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:2 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.026) 0:01:15.695 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:13 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.016) 0:01:15.711 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:22 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.201) 0:01:15.913 ******** ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:27 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.015) 0:01:15.928 ******** ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:33 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.014) 0:01:15.943 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:42 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.015) 0:01:15.958 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:48 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.013) 0:01:15.971 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:54 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.014) 0:01:15.986 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:59 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:15.997 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:73 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.020) 0:01:16.018 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:8 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.021) 0:01:16.040 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:14 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.012) 0:01:16.052 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:21 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.064 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:28 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.075 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:35 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.086 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:45 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.097 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:54 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.108 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:64 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.120 ******** 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:74 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.012) 0:01:16.132 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:85 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.143 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:95 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.154 ******** ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:76 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.165 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml:2 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.024) 0:01:16.189 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] ******************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.028) 0:01:16.218 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.230 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21 Sunday 28 January 2024 06:00:50 +0000 (0:00:00.011) 0:01:16.241 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.253 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.264 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.275 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.286 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.298 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.310 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.322 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.334 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.345 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.356 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.367 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:79 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.010) 0:01:16.378 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml:2 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.041) 0:01:16.419 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut TASK [Get information about thinpool] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.030) 0:01:16.450 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16 Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.462 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task 
path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.473 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.486 ********
ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Get information about thinpool] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.497 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.508 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.520 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.013) 0:01:16.533 ********
ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:82
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.544 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:5
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.024) 0:01:16.569 ********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:13
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.013) 0:01:16.583 ********
skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:20
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.014) 0:01:16.597 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml for sut

TASK [Set variables used by tests] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:2
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.021) 0:01:16.619 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:9
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.013) 0:01:16.633 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:18
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.014) 0:01:16.647 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:27
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.658 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:37
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.670 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:47
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.682 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:27
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.693 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:85
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.704 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml for sut

TASK [Validate pool member VDO settings] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml:2
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.027) 0:01:16.732 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut

TASK [Get information about VDO deduplication] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.032) 0:01:16.764 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.013) 0:01:16.777 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.013) 0:01:16.791 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.804 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.013) 0:01:16.817 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.830 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.016) 0:01:16.847 ********
ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Get information about VDO deduplication] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.859 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.014) 0:01:16.873 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.886 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.899 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.013) 0:01:16.912 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.924 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:16.937 ********
ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:88
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.017) 0:01:16.955 ********
ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml:3
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:16.967 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut

TASK [Set storage volume test variables] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.028) 0:01:16.995 ********
ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.017) 0:01:17.013 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut

TASK [Get expected mount device based on device type] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.056) 0:01:17.069 ********
ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.015) 0:01:17.084 ********
ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1536347, "block_size": 4096, "block_total": 1555456, "block_used": 19109, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3143677, "inode_total": 3143680, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6292877312, "size_total": 6371147776, "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1536347, "block_size": 4096, "block_total": 1555456, "block_used": 19109, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3143677, "inode_total": 3143680, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6292877312, "size_total": 6371147776, "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.019) 0:01:17.104 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.036) 0:01:17.140 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.016) 0:01:17.157 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.014) 0:01:17.172 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.012) 0:01:17.184 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:17.195 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the mount fs type] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:17.207 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Get path of test volume device] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.016) 0:01:17.224 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128
Sunday 28 January 2024 06:00:51 +0000 (0:00:00.011) 0:01:17.235 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.011) 0:01:17.246 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.011) 0:01:17.257 ********
ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.011) 0:01:17.269 ********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.025) 0:01:17.294 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.019) 0:01:17.314 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.016) 0:01:17.331 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.016) 0:01:17.348 ********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.018) 0:01:17.366 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.025) 0:01:17.392 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.019) 0:01:17.411 ********
ok: [sut] => { "changed": false, "stat": { "atime": 1706421619.1827211, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1706421616.8768034, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 817, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1706421616.8768034, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.211) 0:01:17.623 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.018) 0:01:17.641 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.014) 0:01:17.655 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.026) 0:01:17.682 ********
ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.016) 0:01:17.699 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.023) 0:01:17.722 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.015) 0:01:17.738 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10
Sunday 28 January 2024 06:00:52 +0000 (0:00:00.016) 0:01:17.754 ********
ok: [sut] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17
Sunday 28 January 2024 06:00:54 +0000 (0:00:02.408) 0:01:20.162 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23
Sunday 28 January 2024 06:00:54 +0000 (0:00:00.012) 0:01:20.175 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32
Sunday 28 January 2024 06:00:54 +0000 (0:00:00.011) 0:01:20.187 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45
Sunday 28 January 2024 06:00:54 +0000 (0:00:00.018) 0:01:20.205 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51
Sunday 28 January 2024 06:00:54 +0000 (0:00:00.011) 0:01:20.217 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56
Sunday 28 January 2024 06:00:54 +0000 (0:00:00.011) 0:01:20.228 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69
Sunday 28 January 2024 06:00:54 +0000 (0:00:00.011) 0:01:20.239 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:20.251 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:20.262 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.017) 0:01:20.279 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.015) 0:01:20.294 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.307 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.320 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.332 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:20.344 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.013) 0:01:20.357 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.370 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.383 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.395 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:20.407 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.420 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.013) 0:01:20.434 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.021) 0:01:20.456 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.013) 0:01:20.469 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.481 ********
ok: [sut] => { "bytes": 6442450944, "changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.196) 0:01:20.678 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.015) 0:01:20.693 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.013) 0:01:20.707 ********
ok: [sut] => { "storage_test_expected_size": "4290672328.4" }

TASK [Get the size of parent/pool device] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:20.719 ********
ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.196) 0:01:20.916 ********
ok: [sut] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } }

TASK [Show test blockinfo] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.018) 0:01:20.935 ********
ok: [sut] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } }

TASK [Show test pool size] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.019) 0:01:20.954 ********
ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.017) 0:01:20.971 ********
ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "6436008492.599999" }, "changed": false }

TASK [Default thin pool reserved space values] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.016) 0:01:20.988 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:20.999 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.010 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.021 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:21.034 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96
Sunday 28 January 2024 06:00:55 +0000 (0:00:00.030) 0:01:21.064 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin
pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.013) 0:01:21.078 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.089 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.100 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.111 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.123 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.012) 0:01:21.135 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.146 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.158 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.169 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.011) 0:01:21.180 ******** ok: [sut] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.013) 0:01:21.194 ******** ok: [sut] => { "storage_test_expected_size": "6436008492.599999" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.013) 0:01:21.207 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:00:55 +0000 (0:00:00.016) 0:01:21.224 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", 
"--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.040406", "end": "2024-01-28 06:00:56.200823", "rc": 0, "start": "2024-01-28 06:00:56.160417" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.245) 0:01:21.469 ******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.016) 0:01:21.485 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.015) 0:01:21.501 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:21.514 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:21.526 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:21.539 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:21.554 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.011) 0:01:21.565 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:21.579 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.054) 0:01:21.634 ******** ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:21.649 ******** ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 3908145152, "size_total": 4139483136, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 954137, "block_size": 4096, "block_total": 1010616, "block_used": 56479, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 262133, "inode_total": 262144, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 3908145152, "size_total": 4139483136, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.019) 0:01:21.668 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51 Sunday 28 January 2024 
06:00:56 +0000 (0:00:00.012) 0:01:21.680 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:21.695 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.013) 0:01:21.709 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:21.721 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.011) 0:01:21.732 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.011) 0:01:21.743 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.015) 0:01:21.759 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather 
swap info] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.011) 0:01:21.770 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.011) 0:01:21.781 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:21.793 ******** ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.011) 0:01:21.805 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.024) 0:01:21.829 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:21.843 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:21.858 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.011) 0:01:21.870 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:21.884 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.016) 0:01:21.901 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK 
[See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.016) 0:01:21.918 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421645.8237715, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1706421645.813772, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 776, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1706421645.813772, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.209) 0:01:22.128 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.015) 0:01:22.144 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:22.156 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.014) 0:01:22.171 ******** ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:22.183 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.012) 0:01:22.195 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.033) 0:01:22.229 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 Sunday 28 January 2024 06:00:56 +0000 (0:00:00.013) 0:01:22.242 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17 Sunday 28 January 2024 06:00:59 +0000 (0:00:02.386) 0:01:24.628 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.012) 0:01:24.641 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.653 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.016) 0:01:24.670 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.012) 0:01:24.682 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.693 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.012) 0:01:24.706 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.717 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.728 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.015) 0:01:24.744 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.014) 0:01:24.759 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.770 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.012) 0:01:24.783 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] 
**************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.794 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.805 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.817 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.828 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.839 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.012) 0:01:24.851 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.863 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.874 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.885 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.896 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.011) 0:01:24.907 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.012) 0:01:24.920 ******** ok: [sut] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] 
********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.199) 0:01:25.119 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.014) 0:01:25.134 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.014) 0:01:25.148 ******** ok: [sut] => { "storage_test_expected_size": "6436008492.599999" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32 Sunday 28 January 2024 06:00:59 +0000 (0:00:00.012) 0:01:25.160 ******** ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.197) 0:01:25.358 ******** ok: [sut] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "40%", "state": 
"present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } } TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.019) 0:01:25.378 ******** ok: [sut] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "4G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, 
"/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } } TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.020) 0:01:25.398 ******** ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.017) 0:01:25.416 ******** ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "4290672328.4" }, "changed": false } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.017) 0:01:25.433 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:25.446 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.013) 0:01:25.460 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin 
pool] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.471 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.482 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.494 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.505 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.516 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:25.528 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.028) 0:01:25.557 ******** skipping: [sut] 
=> {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:25.570 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.581 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.593 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.604 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:25.615 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:25.628 ******** ok: [sut] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } 
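The size check that follows can be reproduced from the numbers already shown in the log: the pool is 10726680821 bytes, volume `test2` is specified as "40%", and the expected size comes out as 4290672328.4, while the actual LV is 4 GiB (4294967296 bytes) because LVM rounds the volume to whole physical extents. This is a sketch of that arithmetic only, assuming a plain percentage calculation and a small comparison tolerance; it is not the role's actual Jinja expression.

```python
import math

# Values copied from the log above (pool "foo" on /dev/sda).
pool_size_bytes = 10726680821      # "Show test pool size": 9 GiB usable
percent = 40                       # test2 is sized as "40%" of the pool

# Expected size as a plain percentage of the pool size (floating point,
# which explains the long decimals in the log output).
expected = pool_size_bytes * percent / 100
assert math.isclose(expected, 4290672328.4, rel_tol=1e-9)

# The actual LV is rounded up to a whole-extent boundary (4 GiB here),
# so the assertion compares within a tolerance, not for exact equality.
actual = 4294967296                # "Show actual size": 4 GiB
assert abs(actual - expected) / expected < 0.02  # within ~2%
```

The ~2% tolerance here is an assumption for illustration; the role's own assertion logic may use a different epsilon.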
TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.013) 0:01:25.641 ******** ok: [sut] => { "storage_test_expected_size": "4290672328.4" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:25.654 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.017) 0:01:25.671 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.037640", "end": "2024-01-28 06:01:00.644531", "rc": 0, "start": "2024-01-28 06:01:00.606891" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.241) 0:01:25.912 ******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.016) 0:01:25.928 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.015) 0:01:25.944 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.013) 0:01:25.958 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.013) 0:01:25.972 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.013) 0:01:25.985 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:25.997 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:44 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:26.009 ******** TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:54 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.010) 0:01:26.019 ******** ok: [sut] => { 
"ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Shrink test2 volume via percentage-based size spec] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:83 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:26.030 ******** TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.030) 0:01:26.061 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.017) 0:01:26.078 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.014) 0:01:26.093 ******** skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.031) 0:01:26.125 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:26.137 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.012) 0:01:26.150 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.013) 0:01:26.163 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.011) 0:01:26.175 
******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 28 January 2024 06:01:00 +0000 (0:00:00.024) 0:01:26.199 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 28 January 2024 06:01:03 +0000 (0:00:02.439) 0:01:28.639 ******** ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%" }, { "mount_point": "/opt/test2", "name": "test2", "size": "25%" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 28 January 2024 06:01:03 +0000 (0:00:00.016) 0:01:28.655 ******** ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 28 January 2024 06:01:03 +0000 (0:00:00.013) 0:01:28.669 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 
Sunday 28 January 2024 06:01:05 +0000 (0:00:02.217) 0:01:30.886 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 28 January 2024 06:01:05 +0000 (0:00:00.022) 0:01:30.908 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 28 January 2024 06:01:05 +0000 (0:00:00.018) 0:01:30.926 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 06:01:05 +0000 (0:00:00.014) 0:01:30.941 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 06:01:05 +0000 (0:00:00.062) 0:01:31.004 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK 
[linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 06:01:08 +0000 (0:00:02.388) 0:01:33.393 ******** ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": 
"mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": 
"polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", 
"state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": 
"systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": 
"stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": 
{ "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 06:01:10 +0000 (0:00:02.169) 0:01:35.562 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 06:01:10 +0000 (0:00:00.020) 0:01:35.583 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 06:01:10 +0000 (0:00:00.015) 0:01:35.598 ******** changed: [sut] => { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 28 January 2024 06:01:13 +0000 (0:00:02.957) 0:01:38.556 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 06:01:13 +0000 (0:00:00.012) 0:01:38.569 ******** TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 28 January 2024 06:01:13 +0000 (0:00:00.010) 0:01:38.579 ******** ok: [sut] => { "blivet_output": { "actions": [ { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } 
], "packages": [ "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 28 January 2024 06:01:13 +0000 (0:00:00.015) 0:01:38.594 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:129 Sunday 28 January 2024 06:01:13 +0000 (0:00:00.015) 0:01:38.610 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:145 Sunday 28 January 2024 06:01:13 +0000 (0:00:00.014) 0:01:38.625 ******** TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:157 Sunday 28 January 2024 06:01:13 +0000 (0:00:00.011) 0:01:38.636 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:162 Sunday 28 January 2024 06:01:14 +0000 (0:00:00.996) 0:01:39.633 ******** ok: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } changed: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", 
"passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Sunday 28 January 2024 06:01:14 +0000 (0:00:00.422) 0:01:40.055 ******** skipping: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:189 Sunday 28 January 2024 06:01:14 +0000 (0:00:00.020) 0:01:40.075 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:197 Sunday 28 January 2024 06:01:15 +0000 (0:00:00.970) 0:01:41.046 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421590.370748, 
"attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1705410883.285937, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1705410356.493937, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3072151005", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:202 Sunday 28 January 2024 06:01:16 +0000 (0:00:00.208) 0:01:41.254 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:224 Sunday 28 January 2024 06:01:16 +0000 (0:00:00.012) 0:01:41.267 ******** ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:98 Sunday 28 January 2024 06:01:16 +0000 (0:00:00.781) 0:01:42.048 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:2 Sunday 28 January 2024 06:01:16 +0000 (0:00:00.041) 0:01:42.090 ******** ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:7 Sunday 28 January 2024 06:01:16 +0000 (0:00:00.016) 0:01:42.106 ******** skipping: [sut] => {} TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:15 Sunday 28 January 2024 06:01:16 +0000 (0:00:00.011) 0:01:42.118 ******** ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", 
"label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:20 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.205) 0:01:42.323 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003211", "end": "2024-01-28 06:01:17.255806", "rc": 0, "start": "2024-01-28 06:01:17.252595" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jan 16 13:05:56 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=9db35ec2-66ac-4531-8ad6-ffb8154c9c87 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:25 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.198) 0:01:42.522 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003438", "end": "2024-01-28 06:01:17.458511", "failed_when_result": false, "rc": 0, "start": "2024-01-28 06:01:17.455073" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:34 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.203) 0:01:42.725 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:5 Sunday 28 January 2024 06:01:17 +0000 
(0:00:00.025) 0:01:42.751 ******** ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:18 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.011) 0:01:42.762 ******** ok: [sut] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.039323", "end": "2024-01-28 06:01:17.734612", "rc": 0, "start": "2024-01-28 06:01:17.695289" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:24 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.239) 0:01:43.002 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:34 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.018) 0:01:43.020 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:2 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.025) 0:01:43.046 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:13 Sunday 28 January 2024 06:01:17 +0000 (0:00:00.016) 0:01:43.063 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } 
TASK [Set pvs lvm length] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:22 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.198) 0:01:43.261 ******** ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:27 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.015) 0:01:43.277 ******** ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:33 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.014) 0:01:43.291 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:42 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.015) 0:01:43.307 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:48 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.013) 0:01:43.320 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:54 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.013) 0:01:43.334 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] 
*********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:59 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.345 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:73 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.019) 0:01:43.364 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:8 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.021) 0:01:43.386 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:14 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.398 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:21 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.409 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:28 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:43.421 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:35 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.432 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:45 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.443 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:54 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.455 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:64 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.466 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:74 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.477 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:85 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:43.489 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:95 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.500 ******** 
ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:76 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.511 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml:2 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.023) 0:01:43.535 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.029) 0:01:43.564 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.576 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.587 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] 
****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.598 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.030) 0:01:43.629 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.013) 0:01:43.642 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.654 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.665 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.676 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task 
path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.688 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:43.701 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.712 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.723 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.734 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:79 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.746 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml for sut TASK [Validate pool member thinpool settings] ********************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml:2 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.024) 0:01:43.771 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut TASK [Get information about thinpool] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.029) 0:01:43.800 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.811 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.823 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.834 ******** ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Get information about thinpool] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.846 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task 
path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.857 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.868 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:43.879 ******** ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:82 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:43.892 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:5 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.024) 0:01:43.917 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:13 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.014) 0:01:43.931 ******** skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:20 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.013) 0:01:43.945 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml for sut TASK [Set variables used by tests] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:2 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.024) 0:01:43.969 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:9 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.015) 0:01:43.985 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:18 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.016) 0:01:44.001 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:27 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:44.013 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:37 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:44.026 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test 
variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:47 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:44.039 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:27 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:44.051 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:85 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:44.063 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml for sut TASK [Validate pool member VDO settings] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml:2 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.028) 0:01:44.092 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut TASK [Get information about VDO deduplication] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.032) 0:01:44.124 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.014) 0:01:44.139 ******** skipping: [sut] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.013) 0:01:44.152 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:44.165 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.014) 0:01:44.180 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.013) 0:01:44.193 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.012) 0:01:44.205 ******** ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Get information about VDO deduplication] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.011) 0:01:44.217 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is 
off] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16 Sunday 28 January 2024 06:01:18 +0000 (0:00:00.020) 0:01:44.237 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.250 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.263 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.276 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.288 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.301 ******** ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:88 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.014) 0:01:44.315 ******** ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml:3 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.328 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.028) 0:01:44.357 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.016) 0:01:44.373 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut included: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.102) 0:01:44.475 ******** ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.015) 0:01:44.490 ******** ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1536347, "block_size": 4096, "block_total": 1555456, "block_used": 19109, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3143677, "inode_total": 3143680, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6292877312, "size_total": 6371147776, "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1536347, "block_size": 4096, "block_total": 1555456, "block_used": 19109, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 3143677, "inode_total": 3143680, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_available": 6292877312, "size_total": 6371147776, "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.019) 0:01:44.510 ******** skipping: [sut] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.013) 0:01:44.523 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.015) 0:01:44.539 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.015) 0:01:44.554 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.013) 0:01:44.567 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.580 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.593 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.018) 0:01:44.611 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:44.623 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.013) 0:01:44.637 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.011) 0:01:44.648 ******** ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.011) 0:01:44.660 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs 
defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.024) 0:01:44.684 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.015) 0:01:44.700 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.014) 0:01:44.715 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.011) 0:01:44.726 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.011) 0:01:44.738 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.015) 0:01:44.753 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.016) 0:01:44.770 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421619.1827211, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1706421616.8768034, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 817, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1706421616.8768034, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.209) 0:01:44.979 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.016) 0:01:44.996 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.013) 0:01:45.010 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.015) 0:01:45.025 ******** ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.014) 0:01:45.039 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.013) 0:01:45.052 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.015) 0:01:45.067 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 Sunday 28 January 2024 06:01:19 +0000 (0:00:00.012) 0:01:45.080 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17 Sunday 28 January 2024 06:01:22 
+0000 (0:00:02.448) 0:01:47.529 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.012) 0:01:47.541 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.553 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.016) 0:01:47.570 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.581 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.592 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.603 ******** skipping: [sut] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.614 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.029) 0:01:47.644 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.016) 0:01:47.661 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.014) 0:01:47.675 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.687 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 
0:01:47.698 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.709 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.012) 0:01:47.722 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.733 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.744 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.756 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35 Sunday 28 January 2024 
06:01:22 +0000 (0:00:00.011) 0:01:47.767 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.778 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.012) 0:01:47.791 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.802 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.813 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.825 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.011) 0:01:47.836 ******** ok: [sut] => { "bytes": 6442450944, 
"changed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.200) 0:01:48.037 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.015) 0:01:48.052 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.015) 0:01:48.068 ******** ok: [sut] => { "storage_test_expected_size": "4290672328.4" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32 Sunday 28 January 2024 06:01:22 +0000 (0:00:00.014) 0:01:48.082 ******** ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.202) 0:01:48.285 ******** ok: [sut] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": 
[ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } } TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.019) 0:01:48.304 ******** ok: [sut] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", 
"label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } } TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.019) 0:01:48.324 ******** ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.017) 0:01:48.342 ******** ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "6436008492.599999" }, "changed": false } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.018) 0:01:48.361 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.374 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.385 ******** skipping: [sut] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.398 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.014) 0:01:48.412 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.425 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.438 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.450 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.461 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.013) 0:01:48.474 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.014) 0:01:48.488 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.501 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.513 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.524 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.535 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.547 
******** ok: [sut] => { "storage_test_actual_size": { "bytes": 6442450944, "changed": false, "failed": false, "lvm": "6g", "parted": "6GiB", "size": "6 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.014) 0:01:48.562 ******** ok: [sut] => { "storage_test_expected_size": "6436008492.599999" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.574 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.016) 0:01:48.591 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036792", "end": "2024-01-28 06:01:23.561517", "rc": 0, "start": "2024-01-28 06:01:23.524725" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.238) 0:01:48.830 ******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.015) 0:01:48.846 ******** ok: [sut] => { 
"changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.015) 0:01:48.861 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.874 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.886 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.013) 0:01:48.900 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:48.912 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Set storage volume test variables] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:48.923 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", 
"fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.014) 0:01:48.938 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.073) 0:01:49.011 ******** ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.017) 0:01:49.028 ******** ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 625618, "block_used": 36869, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 2411515904, "size_total": 2562531328, "uuid": 
"535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 625618, "block_used": 36869, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 2411515904, "size_total": 2562531328, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.019) 0:01:49.048 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:49.060 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.015) 0:01:49.076 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.013) 0:01:49.090 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83 Sunday 28 January 
2024 06:01:23 +0000 (0:00:00.012) 0:01:49.102 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:49.113 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:49.125 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.014) 0:01:49.139 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:49.151 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:49.162 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.012) 0:01:49.174 ******** ok: [sut] => { "ansible_facts": { 
"storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.011) 0:01:49.186 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.024) 0:01:49.210 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.014) 0:01:49.225 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58 Sunday 28 January 2024 06:01:23 +0000 (0:00:00.014) 0:01:49.239 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.011) 0:01:49.251 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.012) 0:01:49.263 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.015) 0:01:49.279 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.018) 0:01:49.297 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421674.7867463, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1706421673.1708035, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 776, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1706421673.1708035, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, 
"wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.207) 0:01:49.504 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.015) 0:01:49.520 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.012) 0:01:49.533 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.014) 0:01:49.548 ******** ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.012) 0:01:49.561 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.012) 0:01:49.573 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.013) 0:01:49.587 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 Sunday 28 January 2024 06:01:24 +0000 (0:00:00.011) 0:01:49.598 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17 Sunday 28 January 2024 06:01:26 +0000 (0:00:02.440) 0:01:52.039 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.013) 0:01:52.052 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.011) 0:01:52.063 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.016) 0:01:52.080 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.012) 0:01:52.092 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.012) 0:01:52.105 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.011) 0:01:52.116 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.011) 0:01:52.127 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.011) 0:01:52.138 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.015) 0:01:52.154 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.014) 0:01:52.169 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.030) 0:01:52.199 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.013) 0:01:52.212 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.011) 0:01:52.224 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8 Sunday 28 January 2024 06:01:26 +0000 (0:00:00.011) 0:01:52.235 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.246 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.258 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.012) 0:01:52.270 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.282 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.012) 0:01:52.294 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.012) 0:01:52.306 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.012) 0:01:52.319 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task 
path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.013) 0:01:52.333 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.014) 0:01:52.348 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.014) 0:01:52.362 ******** ok: [sut] => { "bytes": 2684354560, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.204) 0:01:52.567 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.018) 0:01:52.585 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.015) 0:01:52.600 ******** ok: [sut] => { "storage_test_expected_size": "6436008492.599999" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32 Sunday 28 
January 2024 06:01:27 +0000 (0:00:00.012) 0:01:52.613 ******** ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.204) 0:01:52.817 ******** ok: [sut] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", 
"_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } } TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.020) 0:01:52.838 ******** ok: [sut] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "6G", "type": "lvm", "uuid": "e0f34bf1-1ed8-4ab6-886d-2081993af588" }, "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": 
"/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } } TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.019) 0:01:52.857 ******** ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.017) 0:01:52.874 ******** ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "2681670205.25" }, "changed": false } TASK [Default thin pool reserved space values] ********************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.016) 0:01:52.891 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.012) 0:01:52.904 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.916 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.927 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.939 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.950 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 
January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.962 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.013) 0:01:52.975 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.986 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:52.998 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:53.009 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:53.021 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.013) 0:01:53.034 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.012) 0:01:53.047 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:53.058 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.011) 0:01:53.070 ******** ok: [sut] => { "storage_test_actual_size": { "bytes": 2684354560, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.013) 0:01:53.084 ******** ok: [sut] => { "storage_test_expected_size": "2681670205.25" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.012) 0:01:53.096 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:01:27 +0000 (0:00:00.016) 0:01:53.113 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.037626", "end": 
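The assertion above passes even though the actual size (2684354560 bytes, i.e. 2.5 GiB) is not byte-identical to the expected value (2681670205.25): LVM allocates whole physical extents, so the created LV is slightly larger than the raw percentage. A sketch of such a tolerance check follows; the 2% margin is an assumption for illustration, since the log only shows that the assertion passed, not the exact margin the test uses.

```python
# Tolerance-based comparison of actual vs. expected LV size, as a
# sketch of the "Assert expected size is actual size" step.
actual_bytes = 2684354560       # storage_test_actual_size.bytes (2.5 GiB)
expected_bytes = 2681670205.25  # storage_test_expected_size

tolerance = 0.02                # assumed margin, not taken from the log
diff = abs(actual_bytes - expected_bytes)
assert diff <= expected_bytes * tolerance
print(f"within tolerance: diff = {diff} bytes")
```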
"2024-01-28 06:01:28.087138", "rc": 0, "start": "2024-01-28 06:01:28.049512" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.242) 0:01:53.356 ******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.016) 0:01:53.372 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.016) 0:01:53.388 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.012) 0:01:53.400 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.012) 0:01:53.413 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.012) 
0:01:53.425 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.012) 0:01:53.438 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:44 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.011) 0:01:53.449 ******** TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:54 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.011) 0:01:53.461 ******** ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Get the size of test2 volume] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:101 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.011) 0:01:53.472 ******** ok: [sut] => { "changed": false, "cmd": [ "lsblk", "--noheadings", "-o", "SIZE", "/dev/mapper/foo-test2" ], "delta": "0:00:00.004512", "end": "2024-01-28 06:01:28.409025", "rc": 0, "start": "2024-01-28 06:01:28.404513" } STDOUT: 2.5G TASK [Remove the test1 volume without changing its size] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:106 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.204) 0:01:53.676 ******** TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.051) 0:01:53.728 ******** included: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.017) 0:01:53.746 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.014) 0:01:53.761 ******** skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.031) 0:01:53.792 ******** 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.011) 0:01:53.804 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.011) 0:01:53.815 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.011) 0:01:53.826 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.012) 0:01:53.838 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 28 January 2024 06:01:28 +0000 (0:00:00.023) 0:01:53.862 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] 
************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 28 January 2024 06:01:31 +0000 (0:00:02.452) 0:01:56.314 ******** ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "60%", "state": "absent" }, { "mount_point": "/opt/test2", "name": "test2", "size": "25%" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 28 January 2024 06:01:31 +0000 (0:00:00.016) 0:01:56.331 ******** ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 28 January 2024 06:01:31 +0000 (0:00:00.024) 0:01:56.355 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 28 January 2024 06:01:33 +0000 (0:00:02.213) 0:01:58.568 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 28 January 2024 06:01:33 +0000 (0:00:00.023) 0:01:58.592 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", 
"changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 28 January 2024 06:01:33 +0000 (0:00:00.020) 0:01:58.612 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 06:01:33 +0000 (0:00:00.012) 0:01:58.625 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 06:01:33 +0000 (0:00:00.018) 0:01:58.644 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 06:01:35 +0000 (0:00:02.437) 0:02:01.081 ******** ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": 
"console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": 
"debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", 
"source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", 
"source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": 
"systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 06:01:38 +0000 (0:00:02.235) 0:02:03.316 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 06:01:38 +0000 (0:00:00.022) 0:02:03.339 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 06:01:38 +0000 (0:00:00.018) 0:02:03.358 ******** changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 28 January 2024 06:01:40 +0000 (0:00:02.847) 0:02:06.205 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 06:01:40 +0000 (0:00:00.014) 0:02:06.219 ******** TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 28 January 2024 06:01:40 +0000 (0:00:00.011) 0:02:06.231 ******** ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 28 January 2024 06:01:41 +0000 (0:00:00.016) 
0:02:06.248 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:129 Sunday 28 January 2024 06:01:41 +0000 (0:00:00.016) 0:02:06.264 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:145 Sunday 28 January 2024 06:01:41 +0000 (0:00:00.013) 0:02:06.278 ******** changed: [sut] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:157 Sunday 28 January 2024 06:01:41 +0000 (0:00:00.213) 0:02:06.491 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } 
TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:162 Sunday 28 January 2024 06:01:42 +0000 (0:00:00.977) 0:02:07.469 ******** ok: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Sunday 28 January 2024 06:01:42 +0000 (0:00:00.213) 0:02:07.682 ******** skipping: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:189 Sunday 28 January 2024 06:01:42 +0000 (0:00:00.018) 0:02:07.700 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : 
Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:197 Sunday 28 January 2024 06:01:43 +0000 (0:00:00.995) 0:02:08.696 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421590.370748, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1705410883.285937, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1705410356.493937, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3072151005", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:202 Sunday 28 January 2024 06:01:43 +0000 (0:00:00.213) 0:02:08.909 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:224 Sunday 28 January 2024 06:01:43 +0000 (0:00:00.011) 0:02:08.921 ******** ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:123 Sunday 28 January 2024 06:01:44 +0000 (0:00:00.786) 0:02:09.708 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml for sut TASK [Print out pool information] 
**********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:2
Sunday 28 January 2024  06:01:44 +0000 (0:00:00.024)       0:02:09.732 ********
ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "60%", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:7
Sunday 28 January 2024  06:01:44 +0000 (0:00:00.016)       0:02:09.748 ********
skipping: [sut] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:15
Sunday 28 January 2024  06:01:44 +0000 (0:00:00.011)       0:02:09.760 ********
ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:20
Sunday 28 January 2024  06:01:44 +0000 (0:00:00.204)       0:02:09.964 ********
ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003467", "end": "2024-01-28 06:01:44.900746", "rc": 0, "start": "2024-01-28 06:01:44.897279" }

STDOUT:

#
# /etc/fstab
# Created by anaconda on Tue Jan 16 13:05:56 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=9db35ec2-66ac-4531-8ad6-ffb8154c9c87 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:25
Sunday 28 January 2024  06:01:44 +0000 (0:00:00.204)       0:02:10.167 ********
ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003547", "end": "2024-01-28 06:01:45.104109", "failed_when_result": false, "rc": 0, "start": "2024-01-28 06:01:45.100562" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:34
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.204)       0:02:10.372 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml for sut

TASK [Set _storage_pool_tests] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:5
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.025)       0:02:10.397 ********
ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:18
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.011)       0:02:10.409 ********
ok: [sut] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.035645", "end": "2024-01-28 06:01:45.380096", "rc": 0, "start": "2024-01-28 06:01:45.344451" }

STDOUT:

0

TASK [Verify that VG shared value checks out] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:24
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.238)       0:02:10.648 ********
ok: [sut] => { "changed": false }

MSG:

All assertions passed

TASK [Verify pool subset] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:34
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.018)       0:02:10.666 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:2
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.026)       0:02:10.693 ********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:13
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.017)       0:02:10.711 ********
ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [Set pvs lvm length] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:22
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.199)       0:02:10.910 ********
ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }

TASK [Set pool pvs] ************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:27
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.015)       0:02:10.926 ********
ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }

TASK [Verify PV count] *********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:33
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.018)       0:02:10.945 ********
ok: [sut] => { "changed": false }

MSG:

All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:42
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.017)       0:02:10.963 ********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:48
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.015)       0:02:10.978 ********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:54
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.015)       0:02:10.993 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:59
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.012)       0:02:11.005 ********
ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:73
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.019)       0:02:11.025 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml for sut

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:8
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.021)       0:02:11.047 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:14
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.012)       0:02:11.059 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:21
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.011)       0:02:11.070 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:28
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.012)       0:02:11.083 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md chunk size regex] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:35
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.012)       0:02:11.095 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:45
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.011)       0:02:11.107 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:54
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.011)       0:02:11.118 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:64
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.011)       0:02:11.129 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:74
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.011)       0:02:11.141 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:85
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.012)       0:02:11.153 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:95
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.012)       0:02:11.166 ********
ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:76
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.011)       0:02:11.177 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml for sut

TASK [Validate pool member LVM RAID settings] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml:2
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.030)       0:02:11.208 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut

TASK [Get information about the LV] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8
Sunday 28 January 2024  06:01:45 +0000 (0:00:00.033)       0:02:11.242 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.013)       0:02:11.255 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.023)       0:02:11.278 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV stripe size] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.014)       0:02:11.293 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested stripe size] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.021)       0:02:11.314 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected stripe size] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.327 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check stripe size] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.340 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.352 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.365 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.378 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV stripe size] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.013)       0:02:11.391 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested stripe size] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.404 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected stripe size] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:11.415 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check stripe size] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:11.427 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check Thin Pools] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:79
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:11.438 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml for sut

TASK [Validate pool member thinpool settings] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml:2
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.024)       0:02:11.462 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut

TASK [Get information about thinpool] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.028)       0:02:11.491 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:11.502 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.013)       0:02:11.516 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.529 ********
ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Get information about thinpool] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.013)       0:02:11.542 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.555 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.016)       0:02:11.571 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.015)       0:02:11.587 ********
ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:82
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.040)       0:02:11.627 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:5
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.026)       0:02:11.653 ********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:13
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.018)       0:02:11.672 ********
skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:20
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.015)       0:02:11.688 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml for sut

TASK [Set variables used by tests] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:2
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.024)       0:02:11.712 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:9
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.015)       0:02:11.728 ********
ok: [sut] => { "changed": false }

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:18
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.017)       0:02:11.745 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:27
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.017)       0:02:11.763 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:37
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.018)       0:02:11.782 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:47
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.016)       0:02:11.799 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:27
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.015)       0:02:11.814 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:85
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.826 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml for sut

TASK [Validate pool member VDO settings] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml:2
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.030)       0:02:11.857 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut

TASK [Get information about VDO deduplication] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.039)       0:02:11.897 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.014)       0:02:11.911 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.023)       0:02:11.935 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.014)       0:02:11.949 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.013)       0:02:11.963 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:11.975 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.013)       0:02:11.989 ********
ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Get information about VDO deduplication] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:12.000 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:12.013 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.013)       0:02:12.026 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:12.038 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:12.050 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:12.062 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:12.074 ********
ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:88
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.011)       0:02:12.085 ********
ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml:3
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.012)       0:02:12.097 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut

TASK [Set storage volume test variables] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.026)       0:02:12.124 ********
ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.014)       0:02:12.138 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut

TASK [Get expected mount device based on device type] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.056)       0:02:12.195 ********
ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.015)       0:02:12.211 ********
ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38
Sunday 28 January 2024  06:01:46 +0000 (0:00:00.019)       0:02:12.230 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.012)       0:02:12.243 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by mount point] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.012)       0:02:12.256 ********
ok: [sut] => { "changed": false }

MSG:

All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.015)       0:02:12.271 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.026)       0:02:12.298 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.014)       0:02:12.312 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the mount fs type] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.017)       0:02:12.330 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.013)       0:02:12.344 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.012)       0:02:12.357 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.013)       0:02:12.370 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.012)       0:02:12.383 ********
ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2
Sunday 28 January 2024  06:01:47 +0000 (0:00:00.011)       0:02:12.394 ********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": []
}, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.024) 0:02:12.419 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.011) 0:02:12.430 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.015) 0:02:12.446 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.033) 0:02:12.479 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.012) 0:02:12.492 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.012) 0:02:12.505 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.011) 0:02:12.516 ******** ok: [sut] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.199) 0:02:12.716 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.014) 0:02:12.730 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.018) 0:02:12.749 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.010) 0:02:12.760 ******** ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.022) 
0:02:12.782 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.015) 0:02:12.798 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.021) 0:02:12.819 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 Sunday 28 January 2024 06:01:47 +0000 (0:00:00.017) 0:02:12.837 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17 Sunday 28 January 2024 06:01:50 +0000 (0:00:02.482) 0:02:15.320 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.333 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.344 ******** skipping: 
[sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.009) 0:02:15.353 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.365 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.376 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.009) 0:02:15.385 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.009) 0:02:15.394 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.010) 0:02:15.404 ******** ok: [sut] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.015) 0:02:15.420 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.015) 0:02:15.436 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.449 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.460 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.472 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.483 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.494 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.508 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.519 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.530 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.542 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54 Sunday 28 January 2024 
06:01:50 +0000 (0:00:00.012) 0:02:15.554 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.567 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.016) 0:02:15.583 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.597 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.609 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.622 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.635 ******** skipping: [sut] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.647 ******** ok: [sut] => { "storage_test_expected_size": "2681670205.25" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.015) 0:02:15.662 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.676 ******** skipping: [sut] => {} TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.689 ******** skipping: [sut] => {} TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.702 ******** skipping: [sut] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.715 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.727 ******** skipping: 
[sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.014) 0:02:15.742 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.754 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.767 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.780 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.792 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.805 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.818 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.831 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.844 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.856 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.868 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:15.880 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.014) 0:02:15.894 ******** skipping: [sut] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.908 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.921 ******** ok: [sut] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.935 ******** ok: [sut] => { "storage_test_expected_size": "2681670205.25" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.015) 0:02:15.950 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.963 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.013) 0:02:15.977 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:15.989 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:16.002 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:16.013 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:16.024 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:16.036 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.032) 0:02:16.068 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Set storage volume test variables] 
*************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:16.080 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.015) 0:02:16.096 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut TASK [Get expected mount device based on device type] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.054) 0:02:16.150 ******** ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.015) 0:02:16.166 ******** ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 625618, 
"block_used": 36869, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 2411515904, "size_total": 2562531328, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 588749, "block_size": 4096, "block_total": 625618, "block_used": 36869, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 163829, "inode_total": 163840, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 2411515904, "size_total": 2562531328, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.019) 0:02:16.185 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.012) 0:02:16.198 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.015) 0:02:16.214 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.014) 0:02:16.228 
******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83 Sunday 28 January 2024 06:01:50 +0000 (0:00:00.011) 0:02:16.240 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.012) 0:02:16.252 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the mount fs type] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.263 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get path of test volume device] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.015) 0:02:16.279 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.290 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.301 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Unset facts] ************************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.312 ******** ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.012) 0:02:16.325 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.024) 0:02:16.349 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.014) 0:02:16.364 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58 Sunday 28 January 2024 
06:01:51 +0000 (0:00:00.013) 0:02:16.378 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.389 ******** ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.400 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.018) 0:02:16.419 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.016) 0:02:16.435 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421701.2268114, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1706421673.1708035, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 776, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, 
"mimetype": "inode/symlink", "mode": "0660", "mtime": 1706421673.1708035, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.205) 0:02:16.641 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.015) 0:02:16.657 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.012) 0:02:16.669 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.014) 0:02:16.684 ******** ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.012) 0:02:16.696 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.708 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.014) 0:02:16.722 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 Sunday 28 January 2024 06:01:51 +0000 (0:00:00.011) 0:02:16.734 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17 Sunday 28 January 2024 06:01:53 +0000 (0:00:02.398) 0:02:19.133 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23 Sunday 28 January 2024 06:01:53 +0000 (0:00:00.034) 0:02:19.167 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32 Sunday 28 January 2024 06:01:53 +0000 (0:00:00.013) 0:02:19.180 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45 Sunday 28 
January 2024 06:01:53 +0000 (0:00:00.016) 0:02:19.197 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51 Sunday 28 January 2024 06:01:53 +0000 (0:00:00.012) 0:02:19.209 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56 Sunday 28 January 2024 06:01:53 +0000 (0:00:00.012) 0:02:19.222 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69 Sunday 28 January 2024 06:01:53 +0000 (0:00:00.012) 0:02:19.235 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.248 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.259 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.015) 0:02:19.275 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.014) 0:02:19.289 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.301 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.312 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.324 ******** ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.336 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] 
************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.349 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.362 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.374 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.385 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.398 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.409 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.420 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.431 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.443 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:19.455 ******** ok: [sut] => { "bytes": 2684354560, "changed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.204) 0:02:19.659 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.015) 0:02:19.674 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28 
Sunday 28 January 2024 06:01:54 +0000 (0:00:00.014) 0:02:19.689 ******** ok: [sut] => { "storage_test_expected_size": "2681670205.25" } TASK [Get the size of parent/pool device] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.702 ******** ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.201) 0:02:19.903 ******** ok: [sut] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": "60%", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } } TASK [Show test blockinfo] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.019) 0:02:19.923 ******** ok: [sut] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "2.5G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", 
"uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } } TASK [Show test pool size] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.020) 0:02:19.943 ******** ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.018) 0:02:19.962 ******** ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "2681670205.25" }, "changed": false } TASK [Default thin pool reserved 
space values] ********************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.018) 0:02:19.981 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:19.994 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:20.006 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:20.019 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.030 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.041 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.052 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.064 ******** skipping: [sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.075 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:20.087 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.098 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.110 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.121 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool 
volume size based on percentage value] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.132 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.011) 0:02:20.143 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:20.155 ******** ok: [sut] => { "storage_test_actual_size": { "bytes": 2684354560, "changed": false, "failed": false, "lvm": "2g", "parted": "2GiB", "size": "2 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.013) 0:02:20.169 ******** ok: [sut] => { "storage_test_expected_size": "2681670205.25" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.012) 0:02:20.181 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:01:54 +0000 (0:00:00.016) 0:02:20.198 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", 
"foo/test2" ], "delta": "0:00:00.036295", "end": "2024-01-28 06:01:55.167627", "rc": 0, "start": "2024-01-28 06:01:55.131332" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.236) 0:02:20.434 ******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.016) 0:02:20.450 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.017) 0:02:20.468 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.013) 0:02:20.481 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.014) 0:02:20.496 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 
28 January 2024 06:01:55 +0000 (0:00:00.013) 0:02:20.509 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.014) 0:02:20.523 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:44 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.011) 0:02:20.535 ******** TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:54 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.010) 0:02:20.546 ******** ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Get the size of test2 volume again] ************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:126 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.011) 0:02:20.557 ******** ok: [sut] => { "changed": false, "cmd": [ "lsblk", "--noheadings", "-o", "SIZE", "/dev/mapper/foo-test2" ], "delta": "0:00:00.004574", "end": "2024-01-28 06:01:55.493781", "rc": 0, "start": "2024-01-28 06:01:55.489207" } STDOUT: 2.5G TASK [Verify that removing test1 didn't cause a change in test2 size] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:131 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.202) 0:02:20.760 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Grow test2 using a percentage-based size spec] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:136 
Sunday 28 January 2024 06:01:55 +0000 (0:00:00.015) 0:02:20.775 ******** TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.058) 0:02:20.834 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.017) 0:02:20.851 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.016) 0:02:20.867 ******** skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, 
"item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.031) 0:02:20.899 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.012) 0:02:20.912 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.012) 0:02:20.924 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.011) 0:02:20.936 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.013) 0:02:20.949 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 28 January 2024 06:01:55 +0000 (0:00:00.025) 0:02:20.974 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 28 January 2024 06:01:58 +0000 (0:00:02.435) 0:02:23.409 ******** ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "present", "volumes": [ { "mount_point": "/opt/test2", "name": "test2", "size": "50%" } ] } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 28 January 2024 06:01:58 +0000 (0:00:00.014) 0:02:23.424 ******** ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 28 January 2024 06:01:58 +0000 (0:00:00.012) 0:02:23.437 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 28 January 2024 06:02:00 +0000 (0:00:02.002) 0:02:25.439 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should 
be installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 28 January 2024 06:02:00 +0000 (0:00:00.022) 0:02:25.461 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 28 January 2024 06:02:00 +0000 (0:00:00.022) 0:02:25.484 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 06:02:00 +0000 (0:00:00.013) 0:02:25.497 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 06:02:00 +0000 (0:00:00.019) 0:02:25.516 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 06:02:02 +0000 (0:00:02.450) 0:02:27.967 ******** ok: [sut] => { 
"ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", 
"status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { 
"name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": 
"initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { 
"name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": 
"running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": 
{ "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { 
"name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": 
"running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 06:02:04 +0000 (0:00:02.260) 0:02:30.227 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] 
******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 06:02:05 +0000 (0:00:00.022) 0:02:30.250 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 06:02:05 +0000 (0:00:00.013) 0:02:30.263 ******** changed: [sut] => { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" } ], "changed": true, "crypts": [], "leaves": [ "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 28 January 2024 06:02:07 +0000 (0:00:02.535) 0:02:32.798 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 06:02:07 +0000 (0:00:00.013) 0:02:32.811 ******** TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 28 January 2024 06:02:07 +0000 (0:00:00.011) 0:02:32.823 ******** ok: [sut] => { "blivet_output": { "actions": [ { "action": "resize device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "resize format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/mapper/foo-test2", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "ext4", "group": 
null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 28 January 2024 06:02:07 +0000 (0:00:00.040) 0:02:32.863 ******** ok: [sut] => { 
"ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:129 Sunday 28 January 2024 06:02:07 +0000 (0:00:00.015) 0:02:32.879 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task 
path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:145 Sunday 28 January 2024 06:02:07 +0000 (0:00:00.013) 0:02:32.892 ******** TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:157 Sunday 28 January 2024 06:02:07 +0000 (0:00:00.011) 0:02:32.903 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:162 Sunday 28 January 2024 06:02:08 +0000 (0:00:00.991) 0:02:33.895 ******** changed: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Sunday 28 January 2024 06:02:08 +0000 (0:00:00.228) 0:02:34.123 ******** skipping: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'fstype': 'ext4', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "ext4", "group": null, "mode": null, "opts": "defaults", "owner": 
null, "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:189 Sunday 28 January 2024 06:02:08 +0000 (0:00:00.023) 0:02:34.147 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:197 Sunday 28 January 2024 06:02:09 +0000 (0:00:00.989) 0:02:35.137 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421590.370748, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1705410883.285937, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1705410356.493937, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3072151005", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:202 Sunday 28 January 2024 06:02:10 +0000 (0:00:00.210) 0:02:35.347 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:224 Sunday 28 January 2024 06:02:10 +0000 (0:00:00.011) 0:02:35.359 ******** ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:149 Sunday 28 January 2024 06:02:10 +0000 (0:00:00.795) 0:02:36.154 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:2 Sunday 28 January 2024 06:02:10 +0000 (0:00:00.026) 0:02:36.181 ******** ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:7 Sunday 28 January 2024 06:02:10 +0000 (0:00:00.016) 0:02:36.197 ******** skipping: [sut] => {} TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:15 Sunday 28 January 2024 06:02:10 +0000 (0:00:00.012) 0:02:36.209 ******** ok: [sut] => { "changed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "5G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", 
"name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:20 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.209) 0:02:36.419 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003534", "end": "2024-01-28 06:02:11.358506", "rc": 0, "start": "2024-01-28 06:02:11.354972" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jan 16 13:05:56 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=9db35ec2-66ac-4531-8ad6-ffb8154c9c87 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test2 /opt/test2 ext4 defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:25 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.206) 0:02:36.626 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003568", "end": "2024-01-28 06:02:11.566420", "failed_when_result": false, "rc": 0, "start": "2024-01-28 06:02:11.562852" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:34 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.208) 0:02:36.835 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:5 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.024) 0:02:36.859 ******** ok: [sut] => { 
"ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:18 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.011) 0:02:36.870 ******** ok: [sut] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.035907", "end": "2024-01-28 06:02:11.844405", "rc": 0, "start": "2024-01-28 06:02:11.808498" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:24 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.241) 0:02:37.112 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:34 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.017) 0:02:37.129 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:2 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.046) 0:02:37.176 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:13 Sunday 28 January 2024 06:02:11 +0000 (0:00:00.018) 0:02:37.194 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] 
****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:22 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.207) 0:02:37.402 ******** ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:27 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.014) 0:02:37.417 ******** ok: [sut] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:33 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.014) 0:02:37.431 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:42 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.015) 0:02:37.447 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:48 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.013) 0:02:37.461 ******** ok: [sut] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:54 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.014) 0:02:37.475 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:59 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:37.488 ******** ok: [sut] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:73 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.018) 0:02:37.506 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml for sut TASK [Get information about RAID] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:8 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.021) 0:02:37.528 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:14 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.539 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:21 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:37.551 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:28 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.562 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:35 Sunday 28 January 2024 06:02:12 
+0000 (0:00:00.011) 0:02:37.573 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:45 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.585 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:54 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.596 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:64 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.607 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:74 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:37.619 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:85 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.630 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:95 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.642 ******** ok: [sut] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:76 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.653 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml for sut TASK [Validate pool member LVM RAID settings] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml:2 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.023) 0:02:37.676 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml for sut TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:8 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.026) 0:02:37.703 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:16 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:37.715 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:21 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.727 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:29 Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.738 ******** 
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested stripe size] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:34
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:37.751 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected stripe size] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:40
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.762 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check stripe size] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-lvmraid.yml:46
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.773 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check Thin Pools] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:79
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.784 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml for sut

TASK [Validate pool member thinpool settings] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml:2
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.024) 0:02:37.809 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml for sut

TASK [Get information about thinpool] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:8
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.023) 0:02:37.833 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:16
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.844 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:23
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.855 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-thin.yml:27
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.866 ********
ok: [sut] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:82
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:37.879 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:5
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.025) 0:02:37.904 ********
ok: [sut] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:13
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.013) 0:02:37.917 ********
skipping: [sut] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:20
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.013) 0:02:37.931 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml for sut

TASK [Set variables used by tests] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:2
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.022) 0:02:37.954 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:9
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.013) 0:02:37.967 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:18
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.014) 0:02:37.982 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:27
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:37.993 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:37
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:38.005 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-crypttab.yml:47
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:38.016 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:27
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:38.027 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:85
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:38.038 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml for sut

TASK [Validate pool member VDO settings] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml:2
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.026) 0:02:38.064 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml for sut

TASK [Get information about VDO deduplication] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:9
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.065) 0:02:38.130 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:16
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:38.143 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:22
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:38.156 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:28
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.013) 0:02:38.169 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:35
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:38.181 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:41
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.012) 0:02:38.194 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-member-vdo.yml:47
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:38.205 ********
ok: [sut] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:88
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:38.217 ********
ok: [sut] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml:3
Sunday 28 January 2024 06:02:12 +0000 (0:00:00.011) 0:02:38.228 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml for sut

TASK [Set storage volume test variables] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:2
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.023) 0:02:38.251 ********
ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:21
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.266 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml for sut
included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml for sut

TASK [Get expected mount device based on device type] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:7
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.055) 0:02:38.321 ********
ok: [sut] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test2" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:16
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.015) 0:02:38.337 ********
ok: [sut] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1199064, "block_size": 4096, "block_total": 1268648, "block_used": 69584, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 4911366144, "size_total": 5196382208, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1199064, "block_size": 4096, "block_total": 1268648, "block_used": 69584, "device": "/dev/mapper/foo-test2", "fstype": "ext4", "inode_available": 327669, "inode_total": 327680, "inode_used": 11, "mount": "/opt/test2", "options": "rw,seclabel,relatime,stripe=2048", "size_available": 4911366144, "size_total": 5196382208, "uuid": "535527a1-073c-487e-98a8-36b792e0b690" } ], "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:38
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.019) 0:02:38.356 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:51
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.012) 0:02:38.369 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:63
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.015) 0:02:38.384 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:71
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.399 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:83
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.410 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:95
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.012) 0:02:38.422 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the mount fs type] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:110
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.433 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Get path of test volume device] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:122
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.448 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:128
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.459 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:134
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.471 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-mount.yml:146
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.482 ********
ok: [sut] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:2
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.012) 0:02:38.494 ********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 ext4 defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:40
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.024) 0:02:38.518 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:48
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.533 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:58
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.548 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fstab.yml:71
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.559 ********
ok: [sut] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:3
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.573 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-fs.yml:12
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.017) 0:02:38.590 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:3
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.017) 0:02:38.608 ********
ok: [sut] => { "changed": false, "stat": { "atime": 1706421728.85384, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1706421727.455889, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 776, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1706421727.455889, "nlink": 1, "path": "/dev/mapper/foo-test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:9
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.209) 0:02:38.817 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:16
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.015) 0:02:38.833 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:24
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.012) 0:02:38.845 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:30
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.860 ********
ok: [sut] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:34
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.012) 0:02:38.872 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-device.yml:39
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.883 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:3
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.014) 0:02:38.898 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10
Sunday 28 January 2024 06:02:13 +0000 (0:00:00.011) 0:02:38.909 ********
ok: [sut] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:17
Sunday 28 January 2024 06:02:16 +0000 (0:00:02.428) 0:02:41.338 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:23
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.015) 0:02:41.353 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:32
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.365 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:45
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.016) 0:02:41.381 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:51
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:41.394 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:56
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.405 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:69
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.417 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:81
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:41.429 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:94
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.440 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:106
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.015) 0:02:41.456 ********
ok: [sut] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:114
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.014) 0:02:41.471 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:122
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.482 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:131
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.493 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:140
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:41.506 ********
ok: [sut] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:8
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.517 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:14
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.528 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:21
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:41.541 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:28
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.552 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:35
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:41.564 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:45
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:41.577 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:54
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.588 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:63
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.599 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:72
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.611 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-md.yml:81
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.622 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:3
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:41.633 ********
ok: [sut] => { "bytes": 5368709120, "changed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:11
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.208) 0:02:41.841 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:20
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.014) 0:02:41.856 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:28
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.013) 0:02:41.870 ********
ok: [sut] => { "storage_test_expected_size": "2681670205.25" }

TASK [Get the size of parent/pool device] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:32
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:41.882 ********
ok: [sut] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:46
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.204) 0:02:42.087 ********
ok: [sut] => { "storage_test_pool": { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test2", "_raw_device": "/dev/mapper/foo-test2", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "ext4", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } }

TASK [Show test blockinfo] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:50
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.020) 0:02:42.107 ********
ok: [sut] => { "storage_test_blkinfo": { "changed": false, "failed": false, "info": { "/dev/mapper/foo-test2": { "fstype": "ext4", "label": "", "name": "/dev/mapper/foo-test2", "size": "5G", "type": "lvm", "uuid": "535527a1-073c-487e-98a8-36b792e0b690" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "urYyra-dHsd-6wqb-JL2Q-5qRG-9e7s-c6khtG" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } }

TASK [Show test pool size] *****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:54
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.018) 0:02:42.125 ********
ok: [sut] => { "storage_test_pool_size": { "bytes": 10726680821, "changed": false, "failed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:58
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.016) 0:02:42.142 ********
ok: [sut] => { "ansible_facts": { "storage_test_expected_size": "5363340410.5" }, "changed": false }

TASK [Default thin pool reserved space values] *********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:68
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.016) 0:02:42.159 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:72
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:42.170 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:77
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.012) 0:02:42.182 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:83
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:42.194 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:88
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:42.205 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:96
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:42.216 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:104
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:42.227 ********
skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:109
Sunday 28 January 2024 06:02:16 +0000 (0:00:00.011) 0:02:42.238 ********
skipping:
[sut] => {} TASK [Show volume thin pool size] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:113 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.012) 0:02:42.251 ******** skipping: [sut] => {} TASK [Show test volume size] *************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:117 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.262 ******** skipping: [sut] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:121 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.273 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:129 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.284 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:138 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.296 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:142 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.307 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:150 Sunday 28 January 2024 
06:02:17 +0000 (0:00:00.012) 0:02:42.319 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:156 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.330 ******** ok: [sut] => { "storage_test_actual_size": { "bytes": 5368709120, "changed": false, "failed": false, "lvm": "5g", "parted": "5GiB", "size": "5 GiB" } } TASK [Show expected size] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:160 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.013) 0:02:42.344 ******** ok: [sut] => { "storage_test_expected_size": "5363340410.5" } TASK [Assert expected size is actual size] ************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-size.yml:164 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.012) 0:02:42.356 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:5 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.016) 0:02:42.373 ******** ok: [sut] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test2" ], "delta": "0:00:00.036512", "end": "2024-01-28 06:02:17.347169", "rc": 0, "start": "2024-01-28 06:02:17.310657" } STDOUT: LVM2_LV_NAME=test2 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:13 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.242) 0:02:42.615 
******** ok: [sut] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:18 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.015) 0:02:42.631 ******** ok: [sut] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:27 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.015) 0:02:42.647 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:35 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.012) 0:02:42.660 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:41 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.012) 0:02:42.672 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-cache.yml:47 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.012) 0:02:42.684 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume.yml:27 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.012) 0:02:42.697 ******** ok: [sut] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": 
false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:44 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.708 ******** TASK [Clean up variable namespace] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:54 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.009) 0:02:42.718 ******** ok: [sut] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove both of the LVM logical volumes in 'foo' created above] *********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:152 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.032) 0:02:42.750 ******** TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.039) 0:02:42.790 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for sut TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.017) 0:02:42.807 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.014) 0:02:42.822 ******** skipping: [sut] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [sut] => (item=Fedora.yml) => { 
"ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } skipping: [sut] => (item=Fedora_38.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_38.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Check if system is ostree] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:26 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.030) 0:02:42.852 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Set flag to indicate system is ostree] ****** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:31 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.864 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Define an empty list of pools to be used in testing] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.012) 0:02:42.876 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : Define an empty list of volumes to be used in testing] *** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.887 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Include the appropriate provider tasks] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.011) 0:02:42.899 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for sut TASK [linux-system-roles.storage : Make sure blivet is available] ************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Sunday 28 January 2024 06:02:17 +0000 (0:00:00.025) 0:02:42.924 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet TASK [linux-system-roles.storage : Show storage_pools] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:11 Sunday 28 January 2024 06:02:20 +0000 (0:00:02.425) 0:02:45.349 ******** ok: [sut] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "state": "absent" } ] } TASK [linux-system-roles.storage : Show storage_volumes] *********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:16 Sunday 28 January 2024 06:02:20 +0000 (0:00:00.014) 0:02:45.363 ******** ok: [sut] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : Get required packages] ********************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:21 Sunday 28 January 2024 06:02:20 +0000 
(0:00:00.012) 0:02:45.376 ******** ok: [sut] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Enable copr repositories if needed] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:34 Sunday 28 January 2024 06:02:22 +0000 (0:00:02.032) 0:02:47.409 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for sut TASK [linux-system-roles.storage : Check if the COPR support packages should be installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Sunday 28 January 2024 06:02:22 +0000 (0:00:00.021) 0:02:47.431 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Make sure COPR support packages are present] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Sunday 28 January 2024 06:02:22 +0000 (0:00:00.018) 0:02:47.449 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Enable COPRs] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:20 Sunday 28 January 2024 06:02:22 +0000 (0:00:00.011) 0:02:47.460 ******** skipping: [sut] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : Make sure required packages are installed] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41 Sunday 28 January 2024 06:02:22 +0000 (0:00:00.018) 0:02:47.478 ******** ok: [sut] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [linux-system-roles.storage : Get service facts] ************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:57 Sunday 28 January 2024 06:02:24 +0000 (0:00:02.437) 0:02:49.916 ******** ok: [sut] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": 
"active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": 
"grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": 
"sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": 
"systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:64 Sunday 28 January 2024 06:02:26 +0000 (0:00:02.224) 0:02:52.140 ******** ok: [sut] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:78 Sunday 28 January 2024 06:02:26 +0000 (0:00:00.020) 0:02:52.161 ******** TASK [linux-system-roles.storage : Manage the pools and volumes to match the specified state] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84 Sunday 28 January 2024 06:02:26 +0000 (0:00:00.011) 0:02:52.172 ******** changed: [sut] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:98 Sunday 28 January 2024 06:02:29 +0000 (0:00:02.744) 0:02:54.917 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:110 Sunday 28 January 2024 06:02:29 +0000 (0:00:00.012) 0:02:54.929 ******** TASK [linux-system-roles.storage : Show blivet_output] ************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:116 Sunday 28 January 2024 06:02:29 +0000 (0:00:00.011) 0:02:54.941 ******** ok: [sut] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test2", "fs_type": "ext4" }, { "action": "destroy device", "device": "/dev/mapper/foo-test2", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "ext4", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [] } ], "volumes": [] } } TASK [linux-system-roles.storage : Set the list of pools for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:125 Sunday 28 January 2024 06:02:29 +0000 (0:00:00.014) 0:02:54.955 ******** ok: [sut] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [] } ] }, "changed": false } TASK [linux-system-roles.storage : Set the list of volumes for test verification] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:129 Sunday 28 January 2024 06:02:29 +0000 (0:00:00.013) 0:02:54.969 ******** ok: [sut] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : Remove obsolete mounts] ********************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:145 Sunday 28 January 2024 06:02:29 +0000 (0:00:00.013) 0:02:54.982 ******** changed: [sut] => (item={'src': '/dev/mapper/foo-test2', 'path': '/opt/test2', 'state': 'absent', 'fstype': 'ext4'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "ext4", "mount_info": { "fstype": "ext4", "path": "/opt/test2", "src": "/dev/mapper/foo-test2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", 
"passno": "0", "src": "/dev/mapper/foo-test2" } TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:157 Sunday 28 January 2024 06:02:29 +0000 (0:00:00.212) 0:02:55.195 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Set up new/current mounts] ****************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:162 Sunday 28 January 2024 06:02:30 +0000 (0:00:00.970) 0:02:56.166 ******** TASK [linux-system-roles.storage : Manage mount ownership/permissions] ********* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:174 Sunday 28 January 2024 06:02:30 +0000 (0:00:00.012) 0:02:56.178 ******** TASK [linux-system-roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:189 Sunday 28 January 2024 06:02:30 +0000 (0:00:00.011) 0:02:56.190 ******** ok: [sut] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:197 Sunday 28 January 2024 06:02:31 +0000 (0:00:00.961) 0:02:57.151 ******** ok: [sut] => { "changed": false, "stat": { "atime": 1706421590.370748, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1705410883.285937, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131081, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": 
false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1705410356.493937, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3072151005", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:202 Sunday 28 January 2024 06:02:32 +0000 (0:00:00.206) 0:02:57.357 ******** TASK [linux-system-roles.storage : Update facts] ******************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:224 Sunday 28 January 2024 06:02:32 +0000 (0:00:00.011) 0:02:57.369 ******** ok: [sut] TASK [Verify role results] ***************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/tests_lvm_percent_size.yml:161 Sunday 28 January 2024 06:02:32 +0000 (0:00:00.770) 0:02:58.139 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml for sut TASK [Print out pool information] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:2 Sunday 28 January 2024 06:02:32 +0000 (0:00:00.028) 0:02:58.167 ******** ok: [sut] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:7 Sunday 28 January 2024 06:02:32 +0000 (0:00:00.014) 0:02:58.182 ******** skipping: [sut] => {} TASK [Collect info about the volumes.] ***************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:15 Sunday 28 January 2024 06:02:32 +0000 (0:00:00.011) 0:02:58.193 ******** ok: [sut] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "9db35ec2-66ac-4531-8ad6-ffb8154c9c87" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:20 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.205) 0:02:58.398 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003441", "end": "2024-01-28 06:02:33.335203", "rc": 0, "start": "2024-01-28 06:02:33.331762" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jan 16 13:05:56 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=9db35ec2-66ac-4531-8ad6-ffb8154c9c87 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:25 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.204) 0:02:58.603 ******** ok: [sut] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003310", "end": "2024-01-28 06:02:33.537059", "failed_when_result": 
false, "rc": 0, "start": "2024-01-28 06:02:33.533749" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:34 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.201) 0:02:58.804 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml for sut TASK [Set _storage_pool_tests] ************************************************* task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:5 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.022) 0:02:58.827 ******** ok: [sut] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:18 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.032) 0:02:58.859 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:24 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.012) 0:02:58.872 ******** skipping: [sut] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool.yml:34 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.011) 0:02:58.883 ******** included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml for sut included: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml for sut TASK [Set test variables] ****************************************************** task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:2 Sunday 28 January 2024 06:02:33 +0000 (0:00:00.025) 0:02:58.908 ******** ok: [sut] => { "ansible_facts": { 
        "_storage_test_expected_pv_count": "0",
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:13
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.017) 0:02:58.926 ********

TASK [Set pvs lvm length] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:22
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.010) 0:02:58.936 ********
ok: [sut] => {
    "ansible_facts": {
        "__pvs_lvm_len": "0"
    },
    "changed": false
}

TASK [Set pool pvs] ************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:27
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.013) 0:02:58.949 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": []
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:33
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.014) 0:02:58.963 ********
ok: [sut] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:42
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.015) 0:02:58.979 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:48
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.013) 0:02:58.992 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [Set expected pv type] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:54
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.015) 0:02:59.008 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:59
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.012) 0:02:59.021 ********

TASK [Check MD RAID] ***********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:73
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.014) 0:02:59.035 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml for sut

TASK [Get information about RAID] **********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:8
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.027) 0:02:59.063 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set active devices regex] ************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:14
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.013) 0:02:59.077 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set spare devices regex] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:21
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.013) 0:02:59.091 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md version regex] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:28
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.011) 0:02:59.103 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md chunk size regex] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:35
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.011) 0:02:59.114 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the chunk size] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:45
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.010) 0:02:59.125 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID active devices count] *****************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:54
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.010) 0:02:59.136 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID spare devices count] ******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:64
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.012) 0:02:59.148 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID metadata version] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:74
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.010) 0:02:59.159 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID chunk size] ***************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:85
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.010) 0:02:59.170 ********
skipping: [sut] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reset variables used by tests] *******************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-md.yml:95
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.010) 0:02:59.181 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_chunk_size_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:76
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.011) 0:02:59.192 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml for sut

TASK [Validate pool member LVM RAID settings] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-lvmraid.yml:2
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.023) 0:02:59.216 ********

TASK [Check Thin Pools] ********************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:79
Sunday 28 January 2024 06:02:33 +0000 (0:00:00.011) 0:02:59.227 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml for sut

TASK [Validate pool member thinpool settings] **********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-thin.yml:2
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.022) 0:02:59.250 ********

TASK [Check member encryption] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:82
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.011) 0:02:59.261 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml for sut

TASK [Set test variables] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:5
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.025) 0:02:59.286 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:13
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.013) 0:02:59.299 ********

TASK [Validate pool member crypttab entries] ***********************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:20
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.009) 0:02:59.309 ********

TASK [Clear test variables] ****************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-encryption.yml:27
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.009) 0:02:59.319 ********
ok: [sut] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:85
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.011) 0:02:59.331 ********
included: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml for sut

TASK [Validate pool member VDO settings] ***************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-pool-members-vdo.yml:2
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.025) 0:02:59.356 ********

TASK [Clean up test variables] *************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-members.yml:88
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.011) 0:02:59.367 ********
ok: [sut] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [Verify the volumes] ******************************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-pool-volumes.yml:3
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.010) 0:02:59.378 ********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:44
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.010) 0:02:59.389 ********

TASK [Clean up variable namespace] *********************************************
task path: /WORKDIR/git-weekly-ciebjrxnku/tests/verify-role-results.yml:54
Sunday 28 January 2024 06:02:34 +0000 (0:00:00.009) 0:02:59.399 ********
ok: [sut] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
sut                        : ok=811  changed=12  unreachable=0  failed=1  skipped=832  rescued=1  ignored=0

Sunday 28 January 2024 06:02:34 +0000 (0:00:00.006) 0:02:59.405 ********
===============================================================================
linux-system-roles.storage : Make sure blivet is available -------------- 6.77s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Ensure cryptsetup is present -------------------------------------------- 3.48s
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 -----
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 2.99s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 2.96s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 2.85s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 2.74s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Make sure required packages are installed --- 2.70s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
linux-system-roles.storage : Make sure blivet is available -------------- 2.60s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 2.54s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
linux-system-roles.storage : Manage the pools and volumes to match the specified state --- 2.49s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:84
Ensure cryptsetup is present -------------------------------------------- 2.48s
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 -----
linux-system-roles.storage : Make sure required packages are installed --- 2.45s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
linux-system-roles.storage : Make sure blivet is available -------------- 2.45s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
linux-system-roles.storage : Make sure required packages are installed --- 2.45s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
Ensure cryptsetup is present -------------------------------------------- 2.45s
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 -----
Ensure cryptsetup is present -------------------------------------------- 2.44s
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 -----
linux-system-roles.storage : Make sure blivet is available -------------- 2.44s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Ensure cryptsetup is present -------------------------------------------- 2.44s
/WORKDIR/git-weekly-ciebjrxnku/tests/test-verify-volume-encryption.yml:10 -----
linux-system-roles.storage : Make sure required packages are installed --- 2.44s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
linux-system-roles.storage : Make sure required packages are installed --- 2.44s
/WORKDIR/git-weekly-ciebjrxnku/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:41
---^---^---^---^---^---

# STDERR: ---v---v---v---v---v---
---^---^---^---^---^---