ansible-playbook [core 2.12.6] config file = /etc/ansible/ansible.cfg configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python3.9/site-packages/ansible ansible collection location = /tmp/tmp5bkr4li_ executable location = /usr/bin/ansible-playbook python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)] jinja version = 2.11.3 libyaml = True Using /etc/ansible/ansible.cfg as config file Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: rhel-7_setup.yml ***************************************************** 1 plays in /cache/rhel-7_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-7_setup.yml:5 Thursday 21 July 2022 14:49:45 +0000 (0:00:00.018) 0:00:00.018 ********* changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-7.qcow2 : ok=1 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Thursday 21 July 2022 14:49:47 +0000 (0:00:01.433) 0:00:01.452 ********* =============================================================================== set up internal repositories -------------------------------------------- 1.43s /cache/rhel-7_setup.yml:5 ----------------------------------------------------- statically imported: /tmp/tmpaxjje44y/tests/create-test-file.yml statically imported: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml statically imported: /tmp/tmpaxjje44y/tests/create-test-file.yml statically imported: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml statically imported: /tmp/tmpaxjje44y/tests/create-test-file.yml statically imported: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml statically imported: /tmp/tmpaxjje44y/tests/create-test-file.yml statically imported: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml statically imported: /tmp/tmpaxjje44y/tests/create-test-file.yml statically imported: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml statically imported: /tmp/tmpaxjje44y/tests/create-test-file.yml statically imported: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml PLAYBOOK: tests_luks_nvme_generated.yml **************************************** 2 plays in 
/tmp/tmpaxjje44y/tests/tests_luks_nvme_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks_nvme_generated.yml:3 Thursday 21 July 2022 14:49:47 +0000 (0:00:00.050) 0:00:01.502 ********* ok: [/cache/rhel-7.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks_nvme_generated.yml:7 Thursday 21 July 2022 14:49:48 +0000 (0:00:01.014) 0:00:02.517 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_use_interface": "nvme" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:2 Thursday 21 July 2022 14:49:48 +0000 (0:00:00.052) 0:00:02.570 ********* ok: [/cache/rhel-7.qcow2] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:11 Thursday 21 July 2022 14:49:48 +0000 (0:00:00.737) 0:00:03.307 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.033) 0:00:03.340 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.030) 0:00:03.371 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.417) 0:00:03.789 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.096) 0:00:03.885 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.028) 0:00:03.914 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.028) 0:00:03.942 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.047) 0:00:03.989 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.018) 0:00:04.008 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "changes": { "installed": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "rc": 0, "results": [ "Loaded plugins: search-disabled-repos\nResolving Dependencies\n--> Running transaction check\n---> Package libblockdev-crypto.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libblockdev-utils(x86-64) = 2.18-5.el7 for package: libblockdev-crypto-2.18-5.el7.x86_64\n--> Processing Dependency: libvolume_key.so.1()(64bit) for package: libblockdev-crypto-2.18-5.el7.x86_64\n--> Processing Dependency: libbd_utils.so.2()(64bit) for package: libblockdev-crypto-2.18-5.el7.x86_64\n---> Package libblockdev-dm.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libdmraid.so.1(Base)(64bit) for package: libblockdev-dm-2.18-5.el7.x86_64\n--> Processing Dependency: dmraid for package: libblockdev-dm-2.18-5.el7.x86_64\n--> Processing Dependency: libdmraid.so.1()(64bit) for package: libblockdev-dm-2.18-5.el7.x86_64\n---> Package libblockdev-lvm.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: lvm2 for package: libblockdev-lvm-2.18-5.el7.x86_64\n--> Processing Dependency: device-mapper-persistent-data for package: libblockdev-lvm-2.18-5.el7.x86_64\n---> Package libblockdev-mdraid.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: mdadm for package: libblockdev-mdraid-2.18-5.el7.x86_64\n--> Processing Dependency: libbytesize.so.1()(64bit) for package: libblockdev-mdraid-2.18-5.el7.x86_64\n---> Package libblockdev-swap.x86_64 0:2.18-5.el7 will be installed\n---> Package python-enum34.noarch 0:1.0.4-1.el7 will be installed\n---> Package python2-blivet3.noarch 1:3.1.3-3.el7 will be installed\n--> Processing Dependency: blivet3-data = 1:3.1.3-3.el7 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-bytesize >= 0.3 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-blockdev >= 2.17 for package: 
1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: pyparted >= 3.9 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-hawkey for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: lsof for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Running transaction check\n---> Package blivet3-data.noarch 1:3.1.3-3.el7 will be installed\n---> Package device-mapper-persistent-data.x86_64 0:0.8.5-3.el7_9.2 will be installed\n--> Processing Dependency: libaio.so.1(LIBAIO_0.4)(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n--> Processing Dependency: libaio.so.1(LIBAIO_0.1)(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n--> Processing Dependency: libaio.so.1()(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n---> Package dmraid.x86_64 0:1.0.0.rc16-28.el7 will be installed\n--> Processing Dependency: libdevmapper-event.so.1.02(Base)(64bit) for package: dmraid-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: dmraid-events for package: dmraid-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: libdevmapper-event.so.1.02()(64bit) for package: dmraid-1.0.0.rc16-28.el7.x86_64\n---> Package libblockdev-utils.x86_64 0:2.18-5.el7 will be installed\n---> Package libbytesize.x86_64 0:1.2-1.el7 will be installed\n--> Processing Dependency: libmpfr.so.4()(64bit) for package: libbytesize-1.2-1.el7.x86_64\n---> Package lsof.x86_64 0:4.87-6.el7 will be installed\n---> Package lvm2.x86_64 7:2.02.187-6.el7_9.5 will be installed\n--> Processing Dependency: lvm2-libs = 7:2.02.187-6.el7_9.5 for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n--> Processing Dependency: liblvm2app.so.2.2(Base)(64bit) for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n--> Processing Dependency: liblvm2app.so.2.2()(64bit) for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n---> Package mdadm.x86_64 0:4.1-9.el7_9 will be installed\n--> Processing Dependency: libreport-filesystem for package: mdadm-4.1-9.el7_9.x86_64\n---> Package pyparted.x86_64 1:3.9-15.el7 will be installed\n---> Package python2-blockdev.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libblockdev(x86-64) = 2.18-5.el7 for package: python2-blockdev-2.18-5.el7.x86_64\n---> Package python2-bytesize.x86_64 0:1.2-1.el7 will be installed\n---> Package python2-hawkey.x86_64 0:0.22.5-2.el7_9 will be installed\n--> Processing Dependency: libdnf(x86-64) = 0.22.5-2.el7_9 for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: python2-libdnf = 0.22.5-2.el7_9 for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libdnf.so.2()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libjson-glib-1.0.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libmodulemd.so.1()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: librepo.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: librhsm.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0()(64bit) for package: 
python2-hawkey-0.22.5-2.el7_9.x86_64\n---> Package volume_key-libs.x86_64 0:0.3.9-9.el7 will be installed\n--> Running transaction check\n---> Package device-mapper-event-libs.x86_64 7:1.02.170-6.el7_9.5 will be installed\n---> Package dmraid-events.x86_64 0:1.0.0.rc16-28.el7 will be installed\n--> Processing Dependency: sgpio for package: dmraid-events-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: device-mapper-event for package: dmraid-events-1.0.0.rc16-28.el7.x86_64\n---> Package json-glib.x86_64 0:1.4.2-2.el7 will be installed\n---> Package libaio.x86_64 0:0.3.109-13.el7 will be installed\n---> Package libblockdev.x86_64 0:2.18-5.el7 will be installed\n---> Package libdnf.x86_64 0:0.22.5-2.el7_9 will be installed\n---> Package libmodulemd.x86_64 0:1.6.3-1.el7 will be installed\n---> Package librepo.x86_64 0:1.8.1-8.el7_9 will be installed\n---> Package libreport-filesystem.x86_64 0:2.1.11-53.el7 will be installed\n---> Package librhsm.x86_64 0:0.0.3-3.el7_9 will be installed\n---> Package libsolv.x86_64 0:0.6.34-4.el7 will be installed\n---> Package lvm2-libs.x86_64 7:2.02.187-6.el7_9.5 will be installed\n---> Package mpfr.x86_64 0:3.1.1-4.el7 will be installed\n---> Package python2-libdnf.x86_64 0:0.22.5-2.el7_9 will be installed\n--> Running transaction check\n---> Package device-mapper-event.x86_64 7:1.02.170-6.el7_9.5 will be installed\n---> Package sgpio.x86_64 0:1.2.0.10-13.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n libblockdev-crypto x86_64 2.18-5.el7 rhel 60 k\n libblockdev-dm x86_64 2.18-5.el7 rhel-optional 54 k\n libblockdev-lvm x86_64 2.18-5.el7 rhel 62 k\n libblockdev-mdraid x86_64 2.18-5.el7 rhel 57 k\n libblockdev-swap x86_64 2.18-5.el7 rhel 52 k\n python-enum34 noarch 1.0.4-1.el7 rhel 52 k\n python2-blivet3 noarch 1:3.1.3-3.el7 rhel 851 k\nInstalling for dependencies:\n blivet3-data noarch 1:3.1.3-3.el7 rhel 77 k\n device-mapper-event\n x86_64 7:1.02.170-6.el7_9.5 rhel 192 k\n device-mapper-event-libs\n x86_64 7:1.02.170-6.el7_9.5 rhel 192 k\n device-mapper-persistent-data\n x86_64 0.8.5-3.el7_9.2 rhel 423 k\n dmraid x86_64 1.0.0.rc16-28.el7 rhel 151 k\n dmraid-events x86_64 1.0.0.rc16-28.el7 rhel 21 k\n json-glib x86_64 1.4.2-2.el7 rhel 134 k\n libaio x86_64 0.3.109-13.el7 rhel 24 k\n libblockdev x86_64 2.18-5.el7 rhel 119 k\n libblockdev-utils x86_64 2.18-5.el7 rhel 59 k\n libbytesize x86_64 1.2-1.el7 rhel 52 k\n libdnf x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 536 k\n libmodulemd x86_64 1.6.3-1.el7 rhel-7-server-extras-rpms 153 k\n librepo x86_64 1.8.1-8.el7_9 rhel 82 k\n libreport-filesystem\n x86_64 2.1.11-53.el7 rhel 41 k\n librhsm x86_64 0.0.3-3.el7_9 rhel-7-server-extras-rpms 28 k\n libsolv x86_64 0.6.34-4.el7 rhel 329 k\n lsof x86_64 4.87-6.el7 rhel 331 k\n lvm2 x86_64 7:2.02.187-6.el7_9.5 rhel 1.3 M\n lvm2-libs x86_64 7:2.02.187-6.el7_9.5 rhel 1.1 M\n mdadm x86_64 4.1-9.el7_9 rhel 440 k\n mpfr x86_64 3.1.1-4.el7 rhel 203 k\n pyparted x86_64 1:3.9-15.el7 rhel 195 k\n python2-blockdev x86_64 2.18-5.el7 rhel 61 k\n python2-bytesize x86_64 1.2-1.el7 rhel 22 k\n python2-hawkey x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 71 k\n python2-libdnf x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 611 k\n sgpio x86_64 1.2.0.10-13.el7 rhel 14 k\n volume_key-libs x86_64 0.3.9-9.el7 rhel 141 
k\n\nTransaction Summary\n================================================================================\nInstall 7 Packages (+29 Dependent packages)\n\nTotal download size: 8.2 M\nInstalled size: 24 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 20 MB/s | 8.2 MB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : libblockdev-utils-2.18-5.el7.x86_64 1/36 \n Installing : 7:device-mapper-event-libs-1.02.170-6.el7_9.5.x86_64 2/36 \n Installing : json-glib-1.4.2-2.el7.x86_64 3/36 \n Installing : librhsm-0.0.3-3.el7_9.x86_64 4/36 \n Installing : libsolv-0.6.34-4.el7.x86_64 5/36 \n Installing : libaio-0.3.109-13.el7.x86_64 6/36 \n Installing : librepo-1.8.1-8.el7_9.x86_64 7/36 \n Installing : libmodulemd-1.6.3-1.el7.x86_64 8/36 \n Installing : libdnf-0.22.5-2.el7_9.x86_64 9/36 \n Installing : device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64 10/36 \n Installing : 7:device-mapper-event-1.02.170-6.el7_9.5.x86_64 11/36 \n Installing : 7:lvm2-libs-2.02.187-6.el7_9.5.x86_64 12/36 \n Installing : 7:lvm2-2.02.187-6.el7_9.5.x86_64 13/36 \n Installing : python2-libdnf-0.22.5-2.el7_9.x86_64 14/36 \n Installing : python2-hawkey-0.22.5-2.el7_9.x86_64 15/36 \n Installing : libblockdev-2.18-5.el7.x86_64 16/36 \n Installing : python2-blockdev-2.18-5.el7.x86_64 17/36 \n Installing : 1:pyparted-3.9-15.el7.x86_64 18/36 \n Installing : sgpio-1.2.0.10-13.el7.x86_64 19/36 \n Installing : dmraid-1.0.0.rc16-28.el7.x86_64 20/36 \n Installing : dmraid-events-1.0.0.rc16-28.el7.x86_64 21/36 \n Installing : volume_key-libs-0.3.9-9.el7.x86_64 22/36 \n Installing : mpfr-3.1.1-4.el7.x86_64 23/36 \n Installing : libbytesize-1.2-1.el7.x86_64 24/36 \n Installing : python2-bytesize-1.2-1.el7.x86_64 25/36 \n Installing : libreport-filesystem-2.1.11-53.el7.x86_64 26/36 \n Installing : mdadm-4.1-9.el7_9.x86_64 27/36 \n Installing : 1:blivet3-data-3.1.3-3.el7.noarch 28/36 \n Installing : lsof-4.87-6.el7.x86_64 29/36 \n Installing : 1:python2-blivet3-3.1.3-3.el7.noarch 30/36 \n Installing : libblockdev-mdraid-2.18-5.el7.x86_64 31/36 \n Installing : libblockdev-crypto-2.18-5.el7.x86_64 32/36 \n Installing : libblockdev-dm-2.18-5.el7.x86_64 33/36 \n Installing : libblockdev-lvm-2.18-5.el7.x86_64 34/36 \n Installing : libblockdev-swap-2.18-5.el7.x86_64 35/36 \n Installing : python-enum34-1.0.4-1.el7.noarch 36/36 \n Verifying : 7:device-mapper-event-1.02.170-6.el7_9.5.x86_64 1/36 \n Verifying : libblockdev-swap-2.18-5.el7.x86_64 2/36 \n Verifying : librhsm-0.0.3-3.el7_9.x86_64 3/36 \n Verifying : libblockdev-lvm-2.18-5.el7.x86_64 4/36 \n Verifying : lsof-4.87-6.el7.x86_64 5/36 \n Verifying : libblockdev-mdraid-2.18-5.el7.x86_64 6/36 \n Verifying : libdnf-0.22.5-2.el7_9.x86_64 7/36 \n Verifying : python-enum34-1.0.4-1.el7.noarch 8/36 \n Verifying : 1:blivet3-data-3.1.3-3.el7.noarch 9/36 \n Verifying : dmraid-events-1.0.0.rc16-28.el7.x86_64 10/36 \n Verifying : python2-blockdev-2.18-5.el7.x86_64 11/36 \n Verifying : libmodulemd-1.6.3-1.el7.x86_64 12/36 \n Verifying : librepo-1.8.1-8.el7_9.x86_64 13/36 \n Verifying : libblockdev-dm-2.18-5.el7.x86_64 14/36 \n Verifying : json-glib-1.4.2-2.el7.x86_64 15/36 \n Verifying : libaio-0.3.109-13.el7.x86_64 16/36 \n Verifying : 7:lvm2-libs-2.02.187-6.el7_9.5.x86_64 17/36 \n Verifying : python2-hawkey-0.22.5-2.el7_9.x86_64 18/36 \n Verifying : python2-bytesize-1.2-1.el7.x86_64 19/36 \n Verifying : libblockdev-2.18-5.el7.x86_64 20/36 \n Verifying : 
libreport-filesystem-2.1.11-53.el7.x86_64 21/36 \n Verifying : libbytesize-1.2-1.el7.x86_64 22/36 \n Verifying : 7:device-mapper-event-libs-1.02.170-6.el7_9.5.x86_64 23/36 \n Verifying : python2-libdnf-0.22.5-2.el7_9.x86_64 24/36 \n Verifying : 7:lvm2-2.02.187-6.el7_9.5.x86_64 25/36 \n Verifying : libblockdev-utils-2.18-5.el7.x86_64 26/36 \n Verifying : mpfr-3.1.1-4.el7.x86_64 27/36 \n Verifying : volume_key-libs-0.3.9-9.el7.x86_64 28/36 \n Verifying : libsolv-0.6.34-4.el7.x86_64 29/36 \n Verifying : device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64 30/36 \n Verifying : 1:python2-blivet3-3.1.3-3.el7.noarch 31/36 \n Verifying : dmraid-1.0.0.rc16-28.el7.x86_64 32/36 \n Verifying : mdadm-4.1-9.el7_9.x86_64 33/36 \n Verifying : sgpio-1.2.0.10-13.el7.x86_64 34/36 \n Verifying : libblockdev-crypto-2.18-5.el7.x86_64 35/36 \n Verifying : 1:pyparted-3.9-15.el7.x86_64 36/36 \n\nInstalled:\n libblockdev-crypto.x86_64 0:2.18-5.el7 libblockdev-dm.x86_64 0:2.18-5.el7 \n libblockdev-lvm.x86_64 0:2.18-5.el7 libblockdev-mdraid.x86_64 0:2.18-5.el7\n libblockdev-swap.x86_64 0:2.18-5.el7 python-enum34.noarch 0:1.0.4-1.el7 \n python2-blivet3.noarch 1:3.1.3-3.el7 \n\nDependency Installed:\n blivet3-data.noarch 1:3.1.3-3.el7 \n device-mapper-event.x86_64 7:1.02.170-6.el7_9.5 \n device-mapper-event-libs.x86_64 7:1.02.170-6.el7_9.5 \n device-mapper-persistent-data.x86_64 0:0.8.5-3.el7_9.2 \n dmraid.x86_64 0:1.0.0.rc16-28.el7 \n dmraid-events.x86_64 0:1.0.0.rc16-28.el7 \n json-glib.x86_64 0:1.4.2-2.el7 \n libaio.x86_64 0:0.3.109-13.el7 \n libblockdev.x86_64 0:2.18-5.el7 \n libblockdev-utils.x86_64 0:2.18-5.el7 \n libbytesize.x86_64 0:1.2-1.el7 \n libdnf.x86_64 0:0.22.5-2.el7_9 \n libmodulemd.x86_64 0:1.6.3-1.el7 \n librepo.x86_64 0:1.8.1-8.el7_9 \n libreport-filesystem.x86_64 0:2.1.11-53.el7 \n librhsm.x86_64 0:0.0.3-3.el7_9 \n libsolv.x86_64 0:0.6.34-4.el7 \n lsof.x86_64 0:4.87-6.el7 \n lvm2.x86_64 7:2.02.187-6.el7_9.5 \n lvm2-libs.x86_64 7:2.02.187-6.el7_9.5 \n mdadm.x86_64 0:4.1-9.el7_9 \n mpfr.x86_64 0:3.1.1-4.el7 \n pyparted.x86_64 1:3.9-15.el7 \n python2-blockdev.x86_64 0:2.18-5.el7 \n python2-bytesize.x86_64 0:1.2-1.el7 \n python2-hawkey.x86_64 0:0.22.5-2.el7_9 \n python2-libdnf.x86_64 0:0.22.5-2.el7_9 \n sgpio.x86_64 0:1.2.0.10-13.el7 \n volume_key-libs.x86_64 0:0.3.9-9.el7 \n\nComplete!\n" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:49:58 +0000 (0:00:08.968) 0:00:12.976 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:49:58 +0000 (0:00:00.042) 0:00:13.018 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:49:58 +0000 (0:00:00.032) 0:00:13.051 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.660) 0:00:13.711 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.041) 0:00:13.752 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.032) 0:00:13.784 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.036) 0:00:13.821 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.028) 0:00:13.850 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:00 +0000 (0:00:00.526) 0:00:14.376 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": 
"dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { 
"name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": 
"running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { 
"name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:01 +0000 (0:00:01.157) 0:00:15.534 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.055) 0:00:15.590 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.021) 0:00:15.612 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.495) 0:00:16.107 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.037) 0:00:16.144 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.021) 0:00:16.165 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK 
[linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.062) 0:00:16.228 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.062) 0:00:16.290 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:50:01 +0000 (0:00:00.034) 0:00:16.325 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.032) 0:00:16.358 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.020) 0:00:16.379 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.057) 0:00:16.436 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.021) 0:00:16.458 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658414996.3892953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658201031.524, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 70, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658200515.884, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071677828413", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.406) 0:00:16.864 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.021) 0:00:16.886 ********* ok: [/cache/rhel-7.qcow2] 
META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:14 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.829) 0:00:17.715 ********* included: /tmp/tmpaxjje44y/tests/get_unused_disk.yml for /cache/rhel-7.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmpaxjje44y/tests/get_unused_disk.yml:2 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.033) 0:00:17.748 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "disks": [ "nvme1n1" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmpaxjje44y/tests/get_unused_disk.yml:9 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.460) 0:00:18.209 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "unused_disks": [ "nvme1n1" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmpaxjje44y/tests/get_unused_disk.yml:14 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.034) 0:00:18.244 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmpaxjje44y/tests/get_unused_disk.yml:19 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.037) 0:00:18.281 ********* ok: [/cache/rhel-7.qcow2] => { "unused_disks": [ "nvme1n1" ] } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:24 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.035) 0:00:18.317 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.037) 0:00:18.354 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.031) 0:00:18.386 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.403) 0:00:18.790 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.090) 0:00:18.880 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.033) 0:00:18.914 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.032) 0:00:18.946 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.045) 0:00:18.992 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:50:04 +0000 (0:00:00.020) 0:00:19.012 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:05 +0000 (0:00:00.690) 0:00:19.703 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:05 +0000 (0:00:00.036) 0:00:19.739 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:05 +0000 (0:00:00.038) 0:00:19.777 ********* ok: 
[/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:06 +0000 (0:00:00.904) 0:00:20.682 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:06 +0000 (0:00:00.042) 0:00:20.724 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:50:06 +0000 (0:00:00.035) 0:00:20.760 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:50:06 +0000 (0:00:00.039) 0:00:20.799 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:50:06 +0000 (0:00:00.034) 0:00:20.834 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "changes": { "installed": [ "cryptsetup" ] }, "rc": 0, "results": [ "Loaded plugins: search-disabled-repos\nResolving Dependencies\n--> Running transaction check\n---> Package cryptsetup.x86_64 0:2.0.3-6.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n cryptsetup x86_64 2.0.3-6.el7 rhel 154 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package\n\nTotal download size: 154 k\nInstalled size: 354 k\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : cryptsetup-2.0.3-6.el7.x86_64 1/1 \n Verifying : cryptsetup-2.0.3-6.el7.x86_64 1/1 \n\nInstalled:\n cryptsetup.x86_64 0:2.0.3-6.el7 \n\nComplete!\n" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:07 +0000 (0:00:01.488) 0:00:22.322 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": 
"lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": 
"rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": 
"systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:08 +0000 (0:00:00.992) 0:00:23.315 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:09 +0000 (0:00:00.053) 0:00:23.368 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:09 +0000 (0:00:00.020) 
0:00:23.388 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:50:09 +0000 (0:00:00.884) 0:00:24.273 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [], 'volumes': [{'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "encrypted volume 'foo' missing key/password", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:09 +0000 (0:00:00.039) 0:00:24.313 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:40 Thursday 21 July 2022 14:50:09 +0000 (0:00:00.021) 0:00:24.334 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] 
****************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:46 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.035) 0:00:24.370 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:54 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.045) 0:00:24.415 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.036) 0:00:24.452 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.031) 0:00:24.483 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.411) 0:00:24.895 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.087) 0:00:24.982 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.062) 0:00:25.045 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.033) 0:00:25.078 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK 
[linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.047) 0:00:25.126 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:50:10 +0000 (0:00:00.051) 0:00:25.178 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:11 +0000 (0:00:00.661) 0:00:25.839 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:11 +0000 (0:00:00.033) 0:00:25.873 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:11 +0000 (0:00:00.035) 0:00:25.908 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:12 +0000 (0:00:00.840) 0:00:26.749 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:12 +0000 (0:00:00.047) 0:00:26.797 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:50:12 +0000 (0:00:00.033) 0:00:26.831 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:50:12 +0000 (0:00:00.036) 0:00:26.867 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:50:12 +0000 (0:00:00.032) 0:00:26.900 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:13 +0000 (0:00:00.518) 0:00:27.418 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:14 +0000 (0:00:01.041) 0:00:28.460 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:14 +0000 (0:00:00.054) 0:00:28.514 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:14 +0000 (0:00:00.019) 0:00:28.534 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for 
udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:50:21 +0000 (0:00:07.420) 0:00:35.954 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:21 +0000 (0:00:00.036) 0:00:35.991 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:50:21 +0000 (0:00:00.019) 0:00:36.011 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:50:21 +0000 (0:00:00.067) 0:00:36.078 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:50:21 +0000 (0:00:00.068) 0:00:36.146 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { 
"_device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:50:21 +0000 (0:00:00.072) 0:00:36.219 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:50:21 +0000 (0:00:00.039) 0:00:36.259 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:50:22 +0000 (0:00:00.742) 0:00:37.001 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:50:23 +0000 (0:00:00.500) 0:00:37.501 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:50:23 +0000 (0:00:00.455) 0:00:37.957 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658414996.3892953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658201031.524, "dev": 64769, "device_type": 0, "executable": false, "exists": true, 
"gid": 0, "gr_name": "root", "inode": 70, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658200515.884, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071677828413", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:50:23 +0000 (0:00:00.316) 0:00:38.274 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-c375570c-86a3-4ede-8825-5a93fd755f9a', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.467) 0:00:38.741 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:66 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.846) 0:00:39.588 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.036) 0:00:39.624 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.036) 0:00:39.660 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.050) 0:00:39.711 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "size": "10G", "type": "crypt", "uuid": "97169c47-4f16-4c90-9b46-7f56e81eb7d8" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "c375570c-86a3-4ede-8825-5a93fd755f9a" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.451) 0:00:40.162 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003524", "end": "2022-07-21 10:50:26.245623", "rc": 0, "start": "2022-07-21 10:50:26.242099" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.436) 0:00:40.599 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003827", "end": "2022-07-21 10:50:26.560479", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:50:26.556652" } STDOUT: luks-c375570c-86a3-4ede-8825-5a93fd755f9a /dev/nvme1n1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.310) 0:00:40.910 ********* TASK [Verify the volumes with no pool were correctly 
managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.021) 0:00:40.931 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', '_device': '/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a', 'raid_spare_count': None, 'name': 'foo', '_raw_kernel_device': '/dev/nvme1n1', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.056) 0:00:40.988 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.078) 0:00:41.067 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.103) 0:00:41.171 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.039) 0:00:41.210 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": 
"/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "97169c47-4f16-4c90-9b46-7f56e81eb7d8" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "97169c47-4f16-4c90-9b46-7f56e81eb7d8" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.056) 0:00:41.266 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.049) 0:00:41.316 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.048) 0:00:41.365 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.053) 0:00:41.418 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.023) 0:00:41.442 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.022) 0:00:41.465 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.022) 0:00:41.487 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.034) 0:00:41.522 ********* ok: [/cache/rhel-7.qcow2] 
=> { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.059) 0:00:41.581 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.047) 0:00:41.629 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.047) 0:00:41.677 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.034) 0:00:41.711 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.034) 0:00:41.746 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.036) 0:00:41.782 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.038) 0:00:41.821 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415021.4112952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415021.4112952, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 9660, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658415021.4112952, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": 
false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.326) 0:00:42.147 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.036) 0:00:42.183 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.038) 0:00:42.221 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.038) 0:00:42.260 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.036) 0:00:42.297 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.040) 0:00:42.337 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415021.5532951, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415021.5532951, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37237, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415021.5532951, "nlink": 1, "path": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.322) 0:00:42.659 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.544) 0:00:43.204 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1" ], "delta": "0:00:00.103649", "end": "2022-07-21 10:50:29.327897", "rc": 0, "start": "2022-07-21 10:50:29.224248" } STDOUT: LUKS header information for /dev/nvme1n1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 
Payload offset: 4096 MK bits: 512 MK digest: 9d 5b 9e 09 d8 0b 1a c0 00 5a 4f 7a 3c 03 9b 46 c3 d0 9b e9 MK salt: 87 64 de cf d8 07 ea 4d f0 29 15 0f 20 d8 51 3b f8 ad eb 42 c1 7c 3d 41 88 6f 2d a9 51 9f cb 38 MK iterations: 22850 UUID: c375570c-86a3-4ede-8825-5a93fd755f9a Key Slot 0: ENABLED Iterations: 365102 Salt: 22 69 5d 7b 7c 18 b6 56 33 e6 76 d3 5c 96 b9 f8 f3 41 be 72 4b c9 d0 31 75 72 83 a0 a7 6f e7 54 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.478) 0:00:43.682 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.037) 0:00:43.720 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.048) 0:00:43.768 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.039) 0:00:43.808 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.037) 0:00:43.846 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.022) 0:00:43.868 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.021) 0:00:43.890 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.022) 0:00:43.912 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-c375570c-86a3-4ede-8825-5a93fd755f9a /dev/nvme1n1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.048) 0:00:43.961 ********* ok: [/cache/rhel-7.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.046) 0:00:44.007 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.050) 0:00:44.057 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.048) 0:00:44.106 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.049) 0:00:44.155 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.035) 0:00:44.191 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.039) 0:00:44.231 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.037) 0:00:44.269 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:50:29 +0000 (0:00:00.033) 0:00:44.303 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.037) 0:00:44.341 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.039) 0:00:44.380 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.036) 0:00:44.417 ********* skipping: [/cache/rhel-7.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.040) 0:00:44.457 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.026) 0:00:44.484 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.038) 0:00:44.522 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.035) 0:00:44.557 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.034) 0:00:44.591 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.034) 0:00:44.626 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.041) 0:00:44.667 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.036) 0:00:44.703 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.034) 0:00:44.738 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.034) 0:00:44.772 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.037) 0:00:44.810 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] 
****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.036) 0:00:44.846 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.024) 0:00:44.871 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.023) 0:00:44.895 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.025) 0:00:44.920 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.024) 0:00:44.944 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.024) 0:00:44.969 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.022) 0:00:44.991 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.022) 0:00:45.014 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.021) 0:00:45.035 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.062) 0:00:45.098 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/create-test-file.yml:10 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.065) 0:00:45.163 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": 
"/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:72 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.479) 0:00:45.642 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.063) 0:00:45.706 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.035) 0:00:45.741 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.419) 0:00:46.161 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.059) 0:00:46.221 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.033) 0:00:46.255 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.034) 0:00:46.289 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host 
machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.048) 0:00:46.337 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.022) 0:00:46.360 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.678) 0:00:47.038 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.036) 0:00:47.074 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.042) 0:00:47.117 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.964) 0:00:48.082 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.045) 0:00:48.127 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.033) 0:00:48.161 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 
21 July 2022 14:50:33 +0000 (0:00:00.036) 0:00:48.198 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.031) 0:00:48.229 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:34 +0000 (0:00:00.527) 0:00:48.756 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": 
"ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:35 +0000 (0:00:01.024) 0:00:49.781 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:35 +0000 (0:00:00.108) 0:00:49.889 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:35 +0000 (0:00:00.023) 0:00:49.912 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-c375570c-86a3-4ede-8825-5a93fd755f9a' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:50:36 +0000 (0:00:00.976) 0:00:50.889 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [], 'volumes': [{'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 10735321088, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'luks-c375570c-86a3-4ede-8825-5a93fd755f9a' in safe mode due to encryption removal", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:36 +0000 (0:00:00.037) 0:00:50.927 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:87 Thursday 21 July 2022 14:50:36 +0000 (0:00:00.020) 0:00:50.947 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:93 Thursday 21 July 2022 14:50:36 +0000 (0:00:00.035) 0:00:50.982 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:10 Thursday 21 July 2022 14:50:36 +0000 (0:00:00.047) 
0:00:51.030 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415031.2882953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415031.2882953, "dev": 64512, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658415031.2882953, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072758372484", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:15 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.320) 0:00:51.351 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:104 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.041) 0:00:51.392 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.042) 0:00:51.435 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.034) 0:00:51.469 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.409) 0:00:51.879 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.061) 0:00:51.941 ********* 
ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.035) 0:00:51.976 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.035) 0:00:52.012 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.048) 0:00:52.060 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:50:37 +0000 (0:00:00.019) 0:00:52.079 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:38 +0000 (0:00:00.705) 0:00:52.785 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:38 +0000 (0:00:00.033) 0:00:52.819 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:38 +0000 (0:00:00.065) 0:00:52.884 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.968) 0:00:53.853 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.086) 0:00:53.939 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.034) 0:00:53.974 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.042) 0:00:54.017 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.035) 0:00:54.052 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:40 +0000 (0:00:00.528) 0:00:54.581 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": 
"dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { 
"name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": 
"running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { 
"name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:41 +0000 (0:00:01.000) 0:00:55.582 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:41 +0000 (0:00:00.065) 0:00:55.647 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:41 +0000 (0:00:00.020) 0:00:55.668 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": 
"UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:50:42 +0000 (0:00:01.406) 0:00:57.075 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:42 +0000 (0:00:00.035) 0:00:57.110 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:50:42 +0000 (0:00:00.019) 0:00:57.130 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:50:42 +0000 (0:00:00.036) 0:00:57.167 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:50:42 +0000 (0:00:00.035) 0:00:57.202 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:50:42 +0000 (0:00:00.034) 0:00:57.237 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c375570c-86a3-4ede-8825-5a93fd755f9a" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:50:43 +0000 (0:00:00.341) 0:00:57.579 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:50:43 +0000 
(0:00:00.474) 0:00:58.054 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:50:44 +0000 (0:00:00.397) 0:00:58.452 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:50:44 +0000 (0:00:00.457) 0:00:58.909 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415026.5582952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "bfeee4f963ee421dcf38804fca1ac74f95fd65ee", "ctime": 1658415024.3802953, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658415024.3802953, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 57, "uid": 0, "version": "1222817897", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:50:44 +0000 (0:00:00.353) 0:00:59.263 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-c375570c-86a3-4ede-8825-5a93fd755f9a', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:50:45 +0000 (0:00:00.365) 0:00:59.628 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:117 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.875) 0:01:00.504 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: 
/tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.037) 0:01:00.541 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.036) 0:01:00.578 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.051) 0:01:00.629 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "xfs", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "afdad55c-ae9a-45e2-b12b-24006796ea3a" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.311) 0:01:00.940 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": 
"0:00:00.003523", "end": "2022-07-21 10:50:46.907007", "rc": 0, "start": "2022-07-21 10:50:46.903484" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.321) 0:01:01.262 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003326", "end": "2022-07-21 10:50:47.231953", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:50:47.228627" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.318) 0:01:01.581 ********* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.021) 0:01:01.602 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', '_device': '/dev/nvme1n1', 'size': 10735321088, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1', 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a', 'raid_spare_count': None, 'name': 'foo', '_raw_kernel_device': '/dev/nvme1n1', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.055) 0:01:01.657 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.047) 0:01:01.704 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for 
/cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.076) 0:01:01.781 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/nvme1n1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.039) 0:01:01.820 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2610632, "block_size": 4096, "block_total": 2618880, "block_used": 8248, "device": "/dev/nvme1n1", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10693148672, "size_total": 10726932480, "uuid": "afdad55c-ae9a-45e2-b12b-24006796ea3a" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610632, "block_size": 4096, "block_total": 2618880, "block_used": 8248, "device": "/dev/nvme1n1", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10693148672, "size_total": 10726932480, "uuid": "afdad55c-ae9a-45e2-b12b-24006796ea3a" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.084) 0:01:01.905 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.079) 0:01:01.985 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.049) 0:01:02.035 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.080) 0:01:02.115 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.023) 0:01:02.138 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.048) 0:01:02.187 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.022) 0:01:02.209 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.031) 0:01:02.241 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.063) 0:01:02.305 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.050) 0:01:02.355 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.048) 0:01:02.404 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.033) 0:01:02.438 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.034) 0:01:02.472 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.035) 0:01:02.507 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the 
device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.036) 0:01:02.544 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415042.6862953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415042.6862953, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 9660, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658415042.6862953, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.331) 0:01:02.876 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.043) 0:01:02.919 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.039) 0:01:02.958 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.037) 0:01:02.996 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.023) 0:01:03.019 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.038) 0:01:03.058 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.022) 0:01:03.080 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.547) 0:01:03.627 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.028) 0:01:03.656 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.027) 0:01:03.684 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.055) 0:01:03.739 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.023) 0:01:03.763 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.022) 0:01:03.786 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.024) 0:01:03.811 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.023) 0:01:03.835 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.023) 0:01:03.858 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.049) 0:01:03.908 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.050) 0:01:03.958 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.035) 
0:01:03.994 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.037) 0:01:04.032 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.038) 0:01:04.071 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.037) 0:01:04.108 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.037) 0:01:04.146 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.037) 0:01:04.183 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.034) 0:01:04.218 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.035) 0:01:04.254 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.037) 0:01:04.291 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.037) 0:01:04.329 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.037) 0:01:04.367 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.026) 0:01:04.393 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.087) 0:01:04.481 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.037) 0:01:04.519 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.036) 0:01:04.556 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.037) 0:01:04.594 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.036) 0:01:04.630 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.037) 0:01:04.667 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.036) 0:01:04.703 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.037) 0:01:04.741 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.034) 0:01:04.776 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.035) 0:01:04.812 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.023) 
0:01:04.836 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.025) 0:01:04.861 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.025) 0:01:04.886 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.022) 0:01:04.909 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.021) 0:01:04.931 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.025) 0:01:04.956 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.024) 0:01:04.981 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.023) 0:01:05.004 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.033) 0:01:05.038 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/create-test-file.yml:10 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.035) 0:01:05.074 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:123 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.321) 0:01:05.395 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.032) 0:01:05.428 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.032) 0:01:05.461 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.406) 0:01:05.867 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.063) 0:01:05.931 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.034) 0:01:05.965 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.032) 0:01:05.998 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.048) 0:01:06.046 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 
2022 14:50:51 +0000 (0:00:00.021) 0:01:06.067 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.709) 0:01:06.777 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.037) 0:01:06.814 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.035) 0:01:06.850 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.873) 0:01:07.724 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.047) 0:01:07.771 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.035) 0:01:07.806 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.040) 0:01:07.846 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.038) 0:01:07.885 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] 
} TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.532) 0:01:08.417 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": 
"dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { 
"name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service": { "name": 
"systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": 
"systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:55 +0000 (0:00:01.000) 0:01:09.417 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.059) 0:01:09.476 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dc375570c\x2d86a3\x2d4ede\x2d8825\x2d5a93fd755f9a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "name": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service systemd-readahead-replay.service cryptsetup-pre.target dev-nvme1n1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-nvme1n1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-c375570c-86a3-4ede-8825-5a93fd755f9a", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c375570c-86a3-4ede-8825-5a93fd755f9a /dev/nvme1n1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c375570c-86a3-4ede-8825-5a93fd755f9a ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "GuessMainPID": "yes", 
"IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.494) 0:01:09.970 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'nvme1n1' in safe mode due to adding encryption TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.882) 0:01:10.853 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [], 'volumes': [{'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'nvme1n1' in safe mode due to adding encryption", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.042) 0:01:10.895 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dc375570c\x2d86a3\x2d4ede\x2d8825\x2d5a93fd755f9a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "name": 
"systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dc375570c\\x2d86a3\\x2d4ede\\x2d8825\\x2d5a93fd755f9a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": 
"1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:138 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.478) 0:01:11.373 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:144 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.037) 0:01:11.411 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:10 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.109) 0:01:11.520 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415051.0422952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415051.0422952, "dev": 66305, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658415051.0422952, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072465900172", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:15 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.323) 0:01:11.844 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:155 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.038) 0:01:11.882 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.040) 0:01:11.923 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.073) 0:01:11.997 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.429) 0:01:12.426 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was 
False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.060) 0:01:12.486 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.032) 0:01:12.519 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.033) 0:01:12.553 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.045) 0:01:12.599 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.019) 0:01:12.618 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.688) 0:01:13.307 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK 
[linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:59 +0000 (0:00:00.037) 0:01:13.344 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:59 +0000 (0:00:00.039) 0:01:13.383 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:59 +0000 (0:00:00.882) 0:01:14.266 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:59 +0000 (0:00:00.044) 0:01:14.310 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.032) 0:01:14.342 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.036) 0:01:14.379 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.033) 0:01:14.413 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.541) 0:01:14.954 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": 
"systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:51:01 +0000 (0:00:01.053) 0:01:16.008 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.088) 0:01:16.097 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.053) 0:01:16.150 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1", 
"fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:51:09 +0000 (0:00:07.428) 0:01:23.578 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:51:09 +0000 (0:00:00.036) 0:01:23.615 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:51:09 +0000 (0:00:00.022) 0:01:23.637 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", 
"/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:51:09 +0000 (0:00:00.037) 0:01:23.674 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:51:09 +0000 (0:00:00.036) 0:01:23.711 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 
2022 14:51:09 +0000 (0:00:00.038) 0:01:23.749 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=afdad55c-ae9a-45e2-b12b-24006796ea3a" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:51:09 +0000 (0:00:00.344) 0:01:24.093 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:51:10 +0000 (0:00:00.478) 0:01:24.572 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:51:10 +0000 (0:00:00.366) 0:01:24.939 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:51:11 +0000 (0:00:00.459) 0:01:25.398 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415047.2302952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415045.2692952, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 16563, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658415045.2682953, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "312950355", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:51:11 +0000 (0:00:00.315) 0:01:25.713 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:51:11 +0000 (0:00:00.335) 0:01:26.049 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:168 Thursday 21 July 2022 14:51:12 +0000 (0:00:00.853) 0:01:26.902 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:51:12 +0000 (0:00:00.035) 0:01:26.938 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:51:12 +0000 (0:00:00.035) 0:01:26.973 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:51:12 +0000 (0:00:00.079) 0:01:27.052 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "size": "10G", "type": "crypt", "uuid": "473784fd-1c97-426c-aef7-c199912e1f12" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "8c0d3e81-0b87-4ffc-8a51-d44c5b359954" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.321) 0:01:27.373 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003516", "end": "2022-07-21 10:51:13.327968", "rc": 0, "start": "2022-07-21 10:51:13.324452" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.306) 0:01:27.680 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003268", "end": "2022-07-21 10:51:13.631850", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:51:13.628582" } STDOUT: luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954 /dev/nvme1n1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.302) 0:01:27.982 ********* TASK [Verify the volumes with no pool were correctly 
managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.024) 0:01:28.007 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', '_device': '/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954', 'raid_spare_count': None, 'name': 'foo', '_raw_kernel_device': '/dev/nvme1n1', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.063) 0:01:28.070 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.048) 0:01:28.119 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.071) 0:01:28.190 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.038) 0:01:28.229 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": 
"/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "473784fd-1c97-426c-aef7-c199912e1f12" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "473784fd-1c97-426c-aef7-c199912e1f12" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.050) 0:01:28.280 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:51:13 +0000 (0:00:00.046) 0:01:28.326 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.050) 0:01:28.376 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.050) 0:01:28.427 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.024) 0:01:28.451 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.025) 0:01:28.477 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.025) 0:01:28.502 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.034) 0:01:28.536 ********* ok: [/cache/rhel-7.qcow2] 
=> { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.063) 0:01:28.600 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.047) 0:01:28.647 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.046) 0:01:28.694 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.035) 0:01:28.729 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.035) 0:01:28.764 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.039) 0:01:28.803 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.037) 0:01:28.841 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415069.0372953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415069.0372953, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 9660, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658415069.0372953, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": 
false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.376) 0:01:29.218 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.036) 0:01:29.255 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.035) 0:01:29.290 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:51:14 +0000 (0:00:00.034) 0:01:29.325 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:51:15 +0000 (0:00:00.022) 0:01:29.347 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:51:15 +0000 (0:00:00.035) 0:01:29.382 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415069.1762953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415069.1762953, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 53334, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415069.1762953, "nlink": 1, "path": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:51:15 +0000 (0:00:00.313) 0:01:29.696 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:51:15 +0000 (0:00:00.519) 0:01:30.215 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1" ], "delta": "0:00:00.034994", "end": "2022-07-21 10:51:16.209166", "rc": 0, "start": "2022-07-21 10:51:16.174172" } STDOUT: LUKS header information for /dev/nvme1n1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 
Payload offset: 4096 MK bits: 512 MK digest: 4a 12 d4 4e 2d 27 23 bf b2 04 08 c0 e9 24 fe 93 f0 ff 4d f6 MK salt: 65 0a 26 65 2c ae 6e 4d 08 83 0b 86 51 b3 7c 90 1b 9b 91 d8 33 15 e9 0f 30 47 9c ba 82 1a 00 81 MK iterations: 22505 UUID: 8c0d3e81-0b87-4ffc-8a51-d44c5b359954 Key Slot 0: ENABLED Iterations: 359592 Salt: 35 d7 d7 76 62 a7 f5 f5 cf 3b 62 bf 28 5f 6a a9 35 6b 81 73 d1 29 e8 b8 05 73 0b eb 7f b5 97 03 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.349) 0:01:30.565 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.038) 0:01:30.603 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.049) 0:01:30.652 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.037) 0:01:30.689 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.037) 0:01:30.727 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.022) 0:01:30.750 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.022) 0:01:30.772 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.024) 0:01:30.797 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954 /dev/nvme1n1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.055) 0:01:30.853 ********* ok: [/cache/rhel-7.qcow2] => { "changed": 
false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.048) 0:01:30.902 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.047) 0:01:30.950 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.048) 0:01:30.998 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.047) 0:01:31.045 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.035) 0:01:31.080 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.037) 0:01:31.118 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.040) 0:01:31.158 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.036) 0:01:31.195 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.037) 0:01:31.233 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.041) 0:01:31.274 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.045) 0:01:31.320 ********* skipping: [/cache/rhel-7.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.038) 0:01:31.358 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.025) 0:01:31.383 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.035) 0:01:31.419 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.041) 0:01:31.461 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.034) 0:01:31.495 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.034) 0:01:31.529 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.038) 0:01:31.567 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.043) 0:01:31.611 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.035) 0:01:31.646 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.033) 0:01:31.680 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.085) 0:01:31.765 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] 
****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.075) 0:01:31.840 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.025) 0:01:31.865 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.025) 0:01:31.891 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.026) 0:01:31.917 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.024) 0:01:31.942 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.024) 0:01:31.967 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.024) 0:01:31.991 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.026) 0:01:32.017 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.022) 0:01:32.040 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.032) 0:01:32.073 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:176 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.032) 0:01:32.106 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** 
task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.040) 0:01:32.146 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.035) 0:01:32.182 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.414) 0:01:32.597 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.066) 0:01:32.663 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.036) 0:01:32.700 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.035) 0:01:32.736 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.046) 0:01:32.782 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 
21 July 2022 14:51:18 +0000 (0:00:00.020) 0:01:32.803 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:51:19 +0000 (0:00:00.707) 0:01:33.510 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:51:19 +0000 (0:00:00.036) 0:01:33.546 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:51:19 +0000 (0:00:00.034) 0:01:33.581 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:51:20 +0000 (0:00:00.983) 0:01:34.565 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:51:20 +0000 (0:00:00.048) 0:01:34.613 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:51:20 +0000 (0:00:00.066) 0:01:34.680 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:51:20 +0000 (0:00:00.040) 0:01:34.721 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:51:20 +0000 (0:00:00.036) 0:01:34.757 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ 
"cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:51:20 +0000 (0:00:00.535) 0:01:35.293 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": 
"nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, 
"rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { 
"name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:51:21 +0000 (0:00:01.020) 0:01:36.313 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:51:22 +0000 (0:00:00.059) 0:01:36.373 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:51:22 +0000 (0:00:00.022) 0:01:36.396 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:51:23 +0000 (0:00:01.028) 0:01:37.424 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': False, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 
'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "encrypted volume 'test1' missing key/password", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.045) 0:01:37.469 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:197 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.022) 0:01:37.492 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:203 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.039) 0:01:37.532 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:210 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.052) 0:01:37.584 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.038) 0:01:37.622 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.033) 0:01:37.656 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.409) 0:01:38.065 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => 
(item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.062) 0:01:38.128 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.032) 0:01:38.161 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.032) 0:01:38.193 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.049) 0:01:38.242 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.020) 0:01:38.263 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.708) 0:01:38.972 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.036) 0:01:39.008 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required 
packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.072) 0:01:39.080 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.997) 0:01:40.078 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.045) 0:01:40.124 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.035) 0:01:40.159 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.109) 0:01:40.268 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.036) 0:01:40.305 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.552) 0:01:40.857 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": 
{ "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { 
"name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": 
"systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:51:27 +0000 (0:00:01.004) 0:01:41.862 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.054) 0:01:41.917 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.020) 0:01:41.937 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": 
"/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "password": "-", "state": "absent" }, { "backing_device": "/dev/nvme1n1p1", "name": "luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:51:35 +0000 (0:00:07.853) 0:01:49.791 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:51:35 +0000 (0:00:00.035) 0:01:49.827 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:51:35 +0000 (0:00:00.021) 0:01:49.848 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": "xfs" 
}, { "action": "destroy device", "device": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "password": "-", "state": "absent" }, { "backing_device": "/dev/nvme1n1p1", "name": "luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:51:35 +0000 (0:00:00.036) 0:01:49.885 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:51:35 +0000 (0:00:00.038) 0:01:49.923 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:51:35 +0000 (0:00:00.039) 0:01:49.963 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:51:36 +0000 (0:00:00.531) 0:01:50.494 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:51:36 +0000 (0:00:00.513) 0:01:51.008 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 
0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:51:37 +0000 (0:00:00.349) 0:01:51.358 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:51:37 +0000 (0:00:00.638) 0:01:51.997 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415073.6302953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "586f4b782e46f2a220a89de24b19c53a08e551fa", "ctime": 1658415071.6882951, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658415071.6882951, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 57, "uid": 0, "version": "18446744071600324051", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:51:37 +0000 (0:00:00.336) 0:01:52.333 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:51:38 +0000 (0:00:00.705) 0:01:53.039 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:227 Thursday 21 July 2022 14:51:40 +0000 (0:00:01.912) 0:01:54.951 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 
21 July 2022 14:51:40 +0000 (0:00:00.039) 0:01:54.991 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:51:40 +0000 (0:00:00.055) 0:01:55.046 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:51:40 +0000 (0:00:00.036) 0:01:55.082 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "size": "10G", "type": "crypt", "uuid": "67e7585f-d86c-4c55-bda8-817e3170254f" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1p1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1p1", "size": "10G", "type": "partition", "uuid": "b46ee94f-b9a4-4717-8f7b-08a03053da3f" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.323) 0:01:55.406 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003766", "end": "2022-07-21 10:51:41.361178", "rc": 0, "start": "2022-07-21 10:51:41.357412" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.324) 0:01:55.730 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003490", "end": "2022-07-21 10:51:41.681149", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:51:41.677659" } STDOUT: luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f /dev/nvme1n1p1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 
Thursday 21 July 2022 14:51:41 +0000 (0:00:00.306) 0:01:56.037 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.065) 0:01:56.102 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.036) 0:01:56.138 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.047) 0:01:56.186 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.022) 0:01:56.209 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.023) 0:01:56.232 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.023) 0:01:56.256 
********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.022) 0:01:56.279 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.022) 0:01:56.302 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:51:41 +0000 (0:00:00.022) 0:01:56.324 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.025) 0:01:56.350 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.027) 0:01:56.377 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.027) 0:01:56.405 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.046) 0:01:56.451 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.027) 0:01:56.479 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.023) 0:01:56.502 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.025) 0:01:56.528 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.029) 0:01:56.557 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.023) 0:01:56.581 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.023) 0:01:56.604 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.022) 0:01:56.627 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.143) 0:01:56.770 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.051) 0:01:56.822 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": 
"uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.032) 0:01:56.854 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.049) 0:01:56.903 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.031) 0:01:56.934 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml for 
/cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.049) 0:01:56.984 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.048) 0:01:57.032 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.025) 0:01:57.058 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.030) 0:01:57.089 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.035) 0:01:57.124 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.046) 0:01:57.171 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.029) 0:01:57.200 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.035) 0:01:57.235 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.042) 0:01:57.278 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:51:42 +0000 (0:00:00.052) 0:01:57.331 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: 
/tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.077) 0:01:57.408 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.039) 0:01:57.448 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "67e7585f-d86c-4c55-bda8-817e3170254f" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "67e7585f-d86c-4c55-bda8-817e3170254f" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.058) 0:01:57.507 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.061) 0:01:57.569 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.055) 0:01:57.624 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.054) 0:01:57.678 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.024) 0:01:57.703 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.027) 0:01:57.730 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.027) 0:01:57.758 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.038) 0:01:57.796 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.066) 0:01:57.863 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.051) 0:01:57.915 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.052) 0:01:57.967 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.035) 0:01:58.003 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.035) 0:01:58.038 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.036) 0:01:58.075 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.097) 0:01:58.173 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415095.2672951, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415095.2672951, "dev": 5, "device_type": 66307, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 60060, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658415095.2672951, "nlink": 1, "path": "/dev/nvme1n1p1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:51:44 +0000 (0:00:00.369) 0:01:58.542 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:51:44 +0000 (0:00:00.041) 0:01:58.584 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:51:44 +0000 (0:00:00.042) 0:01:58.627 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:51:44 +0000 (0:00:00.040) 0:01:58.668 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:51:44 +0000 (0:00:00.029) 0:01:58.697 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:51:44 +0000 (0:00:00.040) 0:01:58.738 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415095.3912952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415095.3912952, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 59197, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": 
false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415095.3912952, "nlink": 1, "path": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:51:44 +0000 (0:00:00.341) 0:01:59.079 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.583) 0:01:59.663 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1p1" ], "delta": "0:00:00.037185", "end": "2022-07-21 10:51:45.696545", "rc": 0, "start": "2022-07-21 10:51:45.659360" } STDOUT: LUKS header information for /dev/nvme1n1p1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: e8 07 fa 6d ac b3 c1 4f 02 89 24 37 f3 b1 17 69 f0 9f 6f e9 MK salt: 84 7e aa bc 80 b0 da 9e ec b8 31 b8 32 08 cb 14 9d 63 13 15 05 53 29 df fc ce ec 28 a0 45 fa 72 MK iterations: 22661 UUID: b46ee94f-b9a4-4717-8f7b-08a03053da3f Key Slot 0: ENABLED Iterations: 361576 Salt: bf ad 54 c5 17 8b 98 c4 03 42 fc 52 83 48 60 69 bb bc 6b c7 c5 0e 4a 2e bb 56 8d fc 44 ac 38 9a Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.391) 0:02:00.055 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.040) 0:02:00.095 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.055) 0:02:00.151 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.039) 0:02:00.191 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.041) 0:02:00.232 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] 
***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.023) 0:02:00.255 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.021) 0:02:00.277 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:51:45 +0000 (0:00:00.022) 0:02:00.300 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f /dev/nvme1n1p1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.054) 0:02:00.354 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.048) 0:02:00.403 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.053) 0:02:00.457 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.053) 0:02:00.511 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.051) 0:02:00.562 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.038) 0:02:00.600 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.040) 0:02:00.641 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 
Thursday 21 July 2022 14:51:46 +0000 (0:00:00.039) 0:02:00.681 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.036) 0:02:00.718 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.036) 0:02:00.754 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.042) 0:02:00.796 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.039) 0:02:00.836 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.039) 0:02:00.876 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.026) 0:02:00.902 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.039) 0:02:00.942 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.037) 0:02:00.980 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.077) 0:02:01.057 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.039) 0:02:01.097 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.124) 0:02:01.221 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.043) 0:02:01.265 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.039) 0:02:01.305 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.042) 0:02:01.347 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.036) 0:02:01.384 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.036) 0:02:01.420 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.025) 0:02:01.445 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.027) 0:02:01.473 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.025) 0:02:01.498 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.026) 0:02:01.524 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.027) 0:02:01.552 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.026) 0:02:01.578 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.023) 0:02:01.602 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.022) 0:02:01.625 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.033) 0:02:01.658 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.021) 0:02:01.680 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/create-test-file.yml:10 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.031) 0:02:01.712 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:233 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.321) 0:02:02.034 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.039) 0:02:02.074 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.033) 0:02:02.107 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:51:48 +0000 (0:00:00.424) 0:02:02.532 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ 
"/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:51:48 +0000 (0:00:00.061) 0:02:02.593 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:51:48 +0000 (0:00:00.046) 0:02:02.640 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:51:48 +0000 (0:00:00.039) 0:02:02.680 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:51:48 +0000 (0:00:00.050) 0:02:02.730 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:51:48 +0000 (0:00:00.022) 0:02:02.752 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:51:49 +0000 (0:00:00.707) 0:02:03.460 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:51:49 +0000 (0:00:00.036) 
0:02:03.497 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:51:49 +0000 (0:00:00.038) 0:02:03.535 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:51:50 +0000 (0:00:01.054) 0:02:04.590 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.047) 0:02:04.637 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.035) 0:02:04.672 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.040) 0:02:04.713 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.032) 0:02:04.745 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.521) 0:02:05.266 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": 
"brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { 
"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": 
"selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service": { "name": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": 
"systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:51:51 +0000 (0:00:01.023) 0:02:06.290 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:51:52 +0000 (0:00:00.059) 0:02:06.349 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d8c0d3e81\x2d0b87\x2d4ffc\x2d8a51\x2dd44c5b359954.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "name": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": 
"inactive", "After": "systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service dev-nvme1n1.device systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-nvme1n1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954 /dev/nvme1n1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8c0d3e81-0b87-4ffc-8a51-d44c5b359954 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": 
"cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:51:52 +0000 (0:00:00.483) 0:02:06.833 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:51:53 +0000 (0:00:01.041) 0:02:07.875 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f' in safe mode due to encryption removal", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.045) 0:02:07.920 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d8c0d3e81\x2d0b87\x2d4ffc\x2d8a51\x2dd44c5b359954.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "name": 
"systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d8c0d3e81\\x2d0b87\\x2d4ffc\\x2d8a51\\x2dd44c5b359954.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": 
"1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:252 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.489) 0:02:08.410 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:258 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.039) 0:02:08.450 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:10 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.050) 0:02:08.500 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415107.6772952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415107.6772952, "dev": 64512, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658415107.6772952, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073308306740", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:15 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.376) 0:02:08.877 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:269 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.091) 0:02:08.968 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.040) 0:02:09.008 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.077) 0:02:09.086 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.431) 0:02:09.518 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was 
False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.062) 0:02:09.580 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.035) 0:02:09.616 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.036) 0:02:09.653 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.047) 0:02:09.700 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.021) 0:02:09.722 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.702) 0:02:10.425 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { 
"encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.038) 0:02:10.463 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.034) 0:02:10.498 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:51:57 +0000 (0:00:01.026) 0:02:11.525 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.049) 0:02:11.574 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.035) 0:02:11.610 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.040) 0:02:11.650 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.039) 0:02:11.690 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.542) 0:02:12.232 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": 
"plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service": { "name": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:51:58 +0000 (0:00:01.038) 0:02:13.271 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:51:58 +0000 (0:00:00.058) 0:02:13.329 ********* changed: [/cache/rhel-7.qcow2] => 
(item=systemd-cryptsetup@luks\x2db46ee94f\x2db9a4\x2d4717\x2d8f7b\x2d08a03053da3f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "name": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-replay.service systemd-readahead-collect.service dev-nvme1n1p1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-nvme1n1p1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f /dev/nvme1n1p1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": 
"systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:51:59 +0000 (0:00:00.533) 0:02:13.863 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1p1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": 
"/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:52:01 +0000 (0:00:01.550) 0:02:15.413 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.037) 0:02:15.450 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2db46ee94f\x2db9a4\x2d4717\x2d8f7b\x2d08a03053da3f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "name": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": 
"18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.481) 0:02:15.932 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1p1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": 
"UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.040) 0:02:15.972 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:52:01 
+0000 (0:00:00.040) 0:02:16.012 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.040) 0:02:16.053 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.358) 0:02:16.411 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.458) 0:02:16.869 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.363) 0:02:17.233 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.452) 0:02:17.686 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415101.6792953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ad38e5447ce4f5eb072ee3a18bc3de7eb62058bf", "ctime": 1658415098.6732953, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 16785288, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658415098.6722953, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": 
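The mounts list returned by blivet drives the two ansible.posix.mount passes above: the first removes the obsolete /etc/fstab entry pointing at the deleted /dev/mapper/luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f device, the second mounts the re-formatted partition by UUID. Reproducing that second pass by hand would look roughly like this sketch (values copied from the task output above):

- name: Mount the re-formatted partition by UUID (sketch of the role's mount pass)
  ansible.posix.mount:
    src: UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf
    path: /opt/test1
    fstype: xfs
    opts: defaults
    state: mounted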
true, "rgrp": false, "roth": false, "rusr": true, "size": 59, "uid": 0, "version": "18446744072187362517", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.333) 0:02:18.019 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:52:04 +0000 (0:00:00.352) 0:02:18.372 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:286 Thursday 21 July 2022 14:52:04 +0000 (0:00:00.838) 0:02:19.210 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:52:04 +0000 (0:00:00.036) 0:02:19.247 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:52:04 +0000 (0:00:00.049) 0:02:19.296 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:52:04 +0000 (0:00:00.034) 0:02:19.331 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1p1": { "fstype": "xfs", "label": "", "name": "/dev/nvme1n1p1", "size": "10G", "type": "partition", "uuid": "3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:52:05 +0000 (0:00:00.358) 0:02:19.689 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003757", "end": "2022-07-21 10:52:05.646872", "rc": 0, "start": "2022-07-21 10:52:05.643115" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:52:05 +0000 (0:00:00.310) 0:02:20.000 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003389", "end": "2022-07-21 10:52:05.956078", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:52:05.952689" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:52:05 +0000 (0:00:00.311) 0:02:20.311 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': 
['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.065) 0:02:20.377 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.114) 0:02:20.492 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.046) 0:02:20.538 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:20.563 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:20.588 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.030) 0:02:20.618 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.027) 0:02:20.646 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.026) 0:02:20.672 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.025) 0:02:20.698 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.027) 0:02:20.726 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:20.751 ********* TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.026) 0:02:20.777 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.041) 0:02:20.819 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:20.843 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:20.867 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:20.892 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:20.917 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.025) 0:02:20.943 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.026) 0:02:20.969 
********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.025) 0:02:20.995 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.038) 0:02:21.033 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.046) 0:02:21.080 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.031) 0:02:21.111 ********* 
included: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.043) 0:02:21.155 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.030) 0:02:21.185 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.045) 0:02:21.231 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.053) 0:02:21.284 ********* TASK [Validate pool member crypttab entries] 
*********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.024) 0:02:21.309 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:52:06 +0000 (0:00:00.022) 0:02:21.331 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.036) 0:02:21.368 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.046) 0:02:21.415 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.028) 
0:02:21.444 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.036) 0:02:21.480 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.042) 0:02:21.522 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.096) 0:02:21.618 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.123) 0:02:21.742 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/nvme1n1p1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 
July 2022 14:52:07 +0000 (0:00:00.041) 0:02:21.783 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2610376, "block_size": 4096, "block_total": 2618624, "block_used": 8248, "device": "/dev/nvme1n1p1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10692100096, "size_total": 10725883904, "uuid": "3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610376, "block_size": 4096, "block_total": 2618624, "block_used": 8248, "device": "/dev/nvme1n1p1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10692100096, "size_total": 10725883904, "uuid": "3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.053) 0:02:21.837 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.048) 0:02:21.886 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.047) 0:02:21.933 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.049) 0:02:21.983 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.024) 0:02:22.007 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.025) 0:02:22.032 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.025) 0:02:22.058 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] 
*********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.032) 0:02:22.090 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.061) 0:02:22.152 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.053) 0:02:22.205 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.050) 0:02:22.256 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.035) 0:02:22.292 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:52:07 +0000 (0:00:00.034) 0:02:22.327 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.040) 0:02:22.367 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.038) 0:02:22.405 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415121.0192952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415121.0192952, "dev": 5, "device_type": 66307, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 71727, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658415121.0192952, "nlink": 1, "path": "/dev/nvme1n1p1", "pw_name": "root", 
"readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.328) 0:02:22.734 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.039) 0:02:22.774 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.041) 0:02:22.816 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.037) 0:02:22.853 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.023) 0:02:22.876 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.049) 0:02:22.925 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:52:08 +0000 (0:00:00.025) 0:02:22.951 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.545) 0:02:23.496 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.030) 0:02:23.526 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.025) 0:02:23.552 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.092) 0:02:23.645 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.025) 0:02:23.670 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.024) 0:02:23.695 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.024) 0:02:23.720 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.026) 0:02:23.747 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.024) 0:02:23.771 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.085) 0:02:23.856 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.050) 0:02:23.907 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.036) 0:02:23.943 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.038) 0:02:23.982 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.045) 0:02:24.027 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.034) 0:02:24.061 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.039) 0:02:24.101 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.043) 0:02:24.144 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.037) 0:02:24.181 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.035) 0:02:24.216 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.038) 0:02:24.255 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.042) 0:02:24.297 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.035) 0:02:24.333 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.023) 0:02:24.356 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.036) 0:02:24.393 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:52:10 +0000 
(0:00:00.043) 0:02:24.436 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.037) 0:02:24.473 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.036) 0:02:24.510 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.034) 0:02:24.545 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.044) 0:02:24.589 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.039) 0:02:24.628 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.037) 0:02:24.666 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.040) 0:02:24.707 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.037) 0:02:24.744 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.022) 0:02:24.767 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.022) 0:02:24.790 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.022) 0:02:24.813 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.023) 0:02:24.836 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.022) 0:02:24.859 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.022) 0:02:24.882 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.022) 0:02:24.904 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.024) 0:02:24.928 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.032) 0:02:24.961 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.020) 0:02:24.981 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/create-test-file.yml:10 Thursday 21 July 2022 14:52:10 +0000 (0:00:00.034) 0:02:25.016 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:292 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.339) 0:02:25.356 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.039) 0:02:25.395 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.036) 0:02:25.432 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : 
Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.425) 0:02:25.857 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.060) 0:02:25.918 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.033) 0:02:25.951 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.032) 0:02:25.984 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.045) 0:02:26.029 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:52:11 +0000 (0:00:00.020) 0:02:26.049 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", 
"libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:52:12 +0000 (0:00:00.701) 0:02:26.751 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:52:12 +0000 (0:00:00.042) 0:02:26.794 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:52:12 +0000 (0:00:00.044) 0:02:26.838 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:52:13 +0000 (0:00:00.973) 0:02:27.812 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:52:13 +0000 (0:00:00.049) 0:02:27.861 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:52:13 +0000 (0:00:00.039) 0:02:27.901 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:52:13 +0000 (0:00:00.039) 0:02:27.940 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:52:13 +0000 (0:00:00.036) 0:02:27.976 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:52:14 +0000 (0:00:00.547) 0:02:28.524 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service": { "name": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": 
"systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 
14:52:15 +0000 (0:00:01.051) 0:02:29.575 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:52:15 +0000 (0:00:00.057) 0:02:29.632 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2db46ee94f\x2db9a4\x2d4717\x2d8f7b\x2d08a03053da3f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "name": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-nvme1n1p1.device systemd-readahead-collect.service systemd-journald.socket cryptsetup-pre.target systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-nvme1n1p1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f /dev/nvme1n1p1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b46ee94f-b9a4-4717-8f7b-08a03053da3f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", 
"LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:52:15 +0000 (0:00:00.481) 0:02:30.114 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'nvme1n1p1' in safe mode due to adding encryption TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:52:16 +0000 (0:00:01.050) 0:02:31.164 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'nvme1n1p1' in safe mode due to adding encryption", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:52:16 +0000 (0:00:00.077) 0:02:31.241 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2db46ee94f\x2db9a4\x2d4717\x2d8f7b\x2d08a03053da3f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "name": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "status": { 
"ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2db46ee94f\\x2db9a4\\x2d4717\\x2d8f7b\\x2d08a03053da3f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", 
"UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:313 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.520) 0:02:31.763 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:319 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.077) 0:02:31.840 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:10 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.125) 0:02:31.965 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415130.9932952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415130.9932952, "dev": 66307, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658415130.9932952, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "439224015", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:15 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.325) 0:02:32.291 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:332 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.040) 0:02:32.331 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testgYdj8nlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:339 Thursday 21 July 2022 14:52:18 +0000 (0:00:00.409) 0:02:32.740 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testgYdj8nlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1658415138.4582446-191193-87441375190718/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:346 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.776) 0:02:33.516 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.053) 
0:02:33.570 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.033) 0:02:33.604 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.425) 0:02:34.029 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.064) 0:02:34.093 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.037) 0:02:34.131 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.033) 0:02:34.165 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.045) 0:02:34.211 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.020) 0:02:34.231 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ 
"python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:52:20 +0000 (0:00:00.699) 0:02:34.931 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testgYdj8nlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:52:20 +0000 (0:00:00.040) 0:02:34.971 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:52:20 +0000 (0:00:00.038) 0:02:35.009 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.959) 0:02:35.968 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.052) 0:02:36.021 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.036) 0:02:36.058 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.041) 0:02:36.100 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.037) 0:02:36.137 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [linux-system-roles.storage : 
get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:52:22 +0000 (0:00:00.605) 0:02:36.742 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { 
"name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { 
"name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK 
[linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:52:23 +0000 (0:00:01.088) 0:02:37.831 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:52:23 +0000 (0:00:00.056) 0:02:37.887 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:52:23 +0000 (0:00:00.020) 0:02:37.908 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-4635b4bf-00eb-416a-96e6-73bd53379745", "password": "/tmp/storage_testgYdj8nlukskey", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testgYdj8nlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] }
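The four blivet actions above (destroy the existing xfs signature on /dev/nvme1n1p1, reformat the partition as LUKS, create the /dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745 device, and put xfs on top of it) are what the role derives from a request for a single encrypted partition volume. A minimal sketch of the kind of storage_pools input that matches the pool and volume facts reported here, reconstructed from this output rather than copied from tests_luks.yml:

  - hosts: all
    tasks:
      - name: Recreate the encrypted volume described above (sketch only)
        include_role:
          name: linux-system-roles.storage
        vars:
          storage_pools:
            - name: foo
              type: partition
              disks: [nvme1n1]
              volumes:
                - name: test1
                  type: partition
                  size: 4g
                  fs_type: xfs
                  mount_point: /opt/test1
                  mount_options: defaults
                  encryption: true
                  # Key file path taken from the "crypts" entry above; presumably a
                  # temporary key file created earlier in the run.
                  encryption_key: /tmp/storage_testgYdj8nlukskey

The "packages" list shows the practical prerequisites such a request pulls in: cryptsetup for the LUKS layer and xfsprogs for the filesystem.

TASK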
[linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:52:31 +0000 (0:00:07.563) 0:02:45.472 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.036) 0:02:45.509 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.021) 0:02:45.530 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-4635b4bf-00eb-416a-96e6-73bd53379745", "password": "/tmp/storage_testgYdj8nlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testgYdj8nlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], 
"volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.042) 0:02:45.573 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testgYdj8nlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.042) 0:02:45.616 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.037) 0:02:45.653 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=3faa4fdf-50cf-46d7-8d8c-fd7fd6acb1cf" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.347) 0:02:46.001 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.508) 0:02:46.509 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.362) 0:02:46.872 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.455) 0:02:47.328 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415125.9542952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415124.0072951, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658415124.0062952, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1833430532", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:52:33 +0000 (0:00:00.322) 0:02:47.650 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '/tmp/storage_testgYdj8nlukskey', 'name': 'luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-4635b4bf-00eb-416a-96e6-73bd53379745", "password": "/tmp/storage_testgYdj8nlukskey", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:52:33 +0000 (0:00:00.380) 0:02:48.031 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:363 Thursday 21 July 2022 14:52:34 +0000 (0:00:00.924) 
0:02:48.956 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:52:34 +0000 (0:00:00.036) 0:02:48.992 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testgYdj8nlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:52:34 +0000 (0:00:00.051) 0:02:49.044 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
**************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:52:34 +0000 (0:00:00.039) 0:02:49.083 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "size": "10G", "type": "crypt", "uuid": "4cddd72a-de32-4330-b92f-b7fc3daba437" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1p1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1p1", "size": "10G", "type": "partition", "uuid": "4635b4bf-00eb-416a-96e6-73bd53379745" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.312) 0:02:49.396 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003456", "end": "2022-07-21 10:52:35.342627", "rc": 0, "start": "2022-07-21 10:52:35.339171" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.300) 0:02:49.696 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003389", "end": "2022-07-21 10:52:35.630035", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:52:35.626646" } STDOUT: luks-4635b4bf-00eb-416a-96e6-73bd53379745 /dev/nvme1n1p1 /tmp/storage_testgYdj8nlukskey
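The two file reads above confirm the change on disk: /etc/fstab now mounts the LUKS mapper device on /opt/test1, and /etc/crypttab maps luks-4635b4bf-00eb-416a-96e6-73bd53379745 to the backing partition /dev/nvme1n1p1 with the temporary key file. The verification tasks that follow check these entries; a condensed sketch of that style of check, with illustrative task and variable names rather than the test suite's own:

  - name: Re-read /etc/crypttab (same command as the task above)
    command: cat /etc/crypttab
    register: storage_test_crypttab_check   # hypothetical register name
    changed_when: false

  - name: Assert the LUKS mapping references the backing partition and key file
    assert:
      that:
        # Assumes single-space field separation, as printed in the STDOUT above.
        - storage_test_crypttab_check.stdout is search('luks-4635b4bf-00eb-416a-96e6-73bd53379745 /dev/nvme1n1p1 /tmp/storage_testgYdj8nlukskey')

TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: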
/tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.306) 0:02:50.002 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testgYdj8nlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.057) 0:02:50.060 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.032) 0:02:50.093 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.044) 0:02:50.137 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.021) 0:02:50.159 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.019) 0:02:50.179 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.021) 0:02:50.200 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.022) 0:02:50.222 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.021) 0:02:50.244 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.021) 0:02:50.265 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.021) 0:02:50.286 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.022) 0:02:50.309 ********* TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:52:35 +0000 (0:00:00.019) 0:02:50.328 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.038) 0:02:50.367 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.025) 0:02:50.392 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.023) 0:02:50.416 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.021) 0:02:50.437 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.021) 0:02:50.459 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task 
path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.023) 0:02:50.482 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.025) 0:02:50.507 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.022) 0:02:50.530 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.079) 0:02:50.609 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.081) 0:02:50.691 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testgYdj8nlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testgYdj8nlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.032) 0:02:50.724 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.047) 0:02:50.771 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testgYdj8nlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testgYdj8nlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.028) 0:02:50.800 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml for 
/cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.048) 0:02:50.848 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.052) 0:02:50.900 ********* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.023) 0:02:50.924 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.020) 0:02:50.945 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.036) 0:02:50.982 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.045) 0:02:51.027 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testgYdj8nlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testgYdj8nlukskey", 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.027) 0:02:51.055 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.034) 0:02:51.089 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testgYdj8nlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.039) 0:02:51.128 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.047) 0:02:51.176 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: 
/tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.074) 0:02:51.250 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:52:36 +0000 (0:00:00.037) 0:02:51.288 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "4cddd72a-de32-4330-b92f-b7fc3daba437" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "4cddd72a-de32-4330-b92f-b7fc3daba437" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.052) 0:02:51.340 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.048) 0:02:51.389 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.045) 0:02:51.434 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.052) 0:02:51.487 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.023) 0:02:51.510 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.023) 0:02:51.534 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.024) 0:02:51.558 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.033) 0:02:51.592 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.059) 0:02:51.651 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.045) 0:02:51.697 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.051) 0:02:51.748 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.039) 0:02:51.787 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.038) 0:02:51.826 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.047) 0:02:51.873 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.094) 0:02:51.967 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415150.9262953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415150.9262953, "dev": 5, "device_type": 66307, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 77824, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658415150.9262953, "nlink": 1, "path": "/dev/nvme1n1p1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:52:37 +0000 (0:00:00.357) 0:02:52.325 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:52:38 +0000 (0:00:00.039) 0:02:52.364 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:52:38 +0000 (0:00:00.039) 0:02:52.404 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:52:38 +0000 (0:00:00.040) 0:02:52.445 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:52:38 +0000 (0:00:00.027) 0:02:52.472 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:52:38 +0000 (0:00:00.038) 0:02:52.510 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415151.0612953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415151.0612953, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 79876, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415151.0612953, "nlink": 1, 
"path": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:52:38 +0000 (0:00:00.311) 0:02:52.821 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.529) 0:02:53.351 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1p1" ], "delta": "0:00:00.037340", "end": "2022-07-21 10:52:39.344369", "rc": 0, "start": "2022-07-21 10:52:39.307029" } STDOUT: LUKS header information for /dev/nvme1n1p1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 3f 08 57 04 52 34 f6 5e a6 ad 57 f0 83 2e a1 42 f4 ac 22 0c MK salt: 96 8c 05 49 ab f6 3b 3b 16 43 e3 c7 39 7b cc 84 0b 58 fe b3 9c 01 b2 54 05 e5 e1 a8 b9 b6 47 0f MK iterations: 22692 UUID: 4635b4bf-00eb-416a-96e6-73bd53379745 Key Slot 0: ENABLED Iterations: 363080 Salt: 33 87 91 66 d1 c5 98 93 90 ee d7 38 c9 79 f0 42 d8 30 85 53 fc 77 56 c1 d7 b5 22 f2 e2 ce aa 0e Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.352) 0:02:53.703 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.041) 0:02:53.744 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.052) 0:02:53.797 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.039) 0:02:53.837 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.041) 0:02:53.878 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:52:39 
+0000 (0:00:00.026) 0:02:53.905 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.024) 0:02:53.929 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.024) 0:02:53.953 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4635b4bf-00eb-416a-96e6-73bd53379745 /dev/nvme1n1p1 /tmp/storage_testgYdj8nlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testgYdj8nlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.055) 0:02:54.009 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.052) 0:02:54.061 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.054) 0:02:54.115 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.052) 0:02:54.168 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.053) 0:02:54.222 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.037) 0:02:54.259 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.036) 0:02:54.295 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:52:39 +0000 (0:00:00.037) 0:02:54.333 ********* skipping: [/cache/rhel-7.qcow2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.040) 0:02:54.374 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.039) 0:02:54.413 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.037) 0:02:54.450 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.035) 0:02:54.485 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.040) 0:02:54.526 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.025) 0:02:54.552 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.036) 0:02:54.589 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.037) 0:02:54.626 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.037) 0:02:54.664 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.036) 0:02:54.700 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.040) 0:02:54.740 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] 
******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.040) 0:02:54.781 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.040) 0:02:54.821 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.039) 0:02:54.860 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.084) 0:02:54.944 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.041) 0:02:54.986 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.024) 0:02:55.010 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.022) 0:02:55.033 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.023) 0:02:55.057 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.025) 0:02:55.083 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.055) 0:02:55.138 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.023) 0:02:55.162 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.022) 0:02:55.184 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.024) 0:02:55.209 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.038) 0:02:55.248 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.021) 0:02:55.269 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:365 Thursday 21 July 2022 14:52:40 +0000 (0:00:00.034) 0:02:55.304 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "path": "/tmp/storage_testgYdj8nlukskey", "state": "absent" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:377 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.316) 0:02:55.621 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.040) 0:02:55.661 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.034) 0:02:55.695 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.421) 0:02:56.117 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": 
"item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.064) 0:02:56.182 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.039) 0:02:56.222 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.035) 0:02:56.257 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.051) 0:02:56.309 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:52:41 +0000 (0:00:00.023) 0:02:56.332 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:52:42 +0000 (0:00:00.677) 0:02:57.009 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:52:42 +0000 (0:00:00.038) 0:02:57.048 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:52:42 +0000 (0:00:00.039) 0:02:57.087 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:52:43 +0000 (0:00:01.014) 0:02:58.102 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:52:43 +0000 (0:00:00.047) 0:02:58.149 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:52:43 +0000 (0:00:00.035) 0:02:58.185 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:52:43 +0000 (0:00:00.038) 0:02:58.224 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:52:43 +0000 (0:00:00.032) 0:02:58.257 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:52:44 +0000 (0:00:00.534) 0:02:58.791 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:52:45 +0000 (0:00:01.069) 0:02:59.860 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:52:45 +0000 (0:00:00.159) 0:03:00.020 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:52:45 +0000 (0:00:00.022) 0:03:00.042 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:52:46 +0000 (0:00:01.091) 0:03:01.134 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': False, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "encrypted volume 'test1' missing key/password", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:52:46 +0000 (0:00:00.039) 0:03:01.174 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:397 Thursday 21 July 2022 14:52:46 +0000 (0:00:00.022) 0:03:01.196 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:403 Thursday 21 July 2022 14:52:46 +0000 (0:00:00.040) 0:03:01.237 
********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:410 Thursday 21 July 2022 14:52:46 +0000 (0:00:00.053) 0:03:01.290 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:52:46 +0000 (0:00:00.037) 0:03:01.328 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:52:47 +0000 (0:00:00.034) 0:03:01.362 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:52:47 +0000 (0:00:00.421) 0:03:01.784 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:52:47 +0000 (0:00:00.072) 0:03:01.856 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:52:47 +0000 (0:00:00.032) 0:03:01.888 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:52:47 +0000 (0:00:00.032) 0:03:01.921 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:52:47 +0000 (0:00:00.046) 0:03:01.967 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:52:47 +0000 (0:00:00.021) 0:03:01.989 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:52:48 +0000 (0:00:00.650) 0:03:02.639 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:52:48 +0000 (0:00:00.039) 0:03:02.679 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:52:48 +0000 (0:00:00.040) 0:03:02.719 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:52:49 +0000 (0:00:01.046) 0:03:03.766 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:52:49 +0000 (0:00:00.048) 0:03:03.814 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:52:49 +0000 (0:00:00.037) 0:03:03.851 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:52:49 +0000 (0:00:00.039) 0:03:03.891 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:52:49 +0000 (0:00:00.036) 0:03:03.927 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:52:50 +0000 (0:00:00.585) 0:03:04.512 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": 
"disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:52:51 +0000 (0:00:01.011) 0:03:05.524 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:52:51 +0000 (0:00:00.058) 0:03:05.582 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:52:51 +0000 (0:00:00.021) 0:03:05.604 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-4635b4bf-00eb-416a-96e6-73bd53379745", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:52:59 +0000 (0:00:08.203) 0:03:13.808 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.040) 0:03:13.849 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.023) 0:03:13.872 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-4635b4bf-00eb-416a-96e6-73bd53379745", "password": "-", "state": "absent" }, 
{ "backing_device": "/dev/mapper/foo-test1", "name": "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.040) 0:03:13.913 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.037) 0:03:13.951 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.038) 0:03:13.989 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4635b4bf-00eb-416a-96e6-73bd53379745" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:53:00 +0000 (0:00:00.353) 0:03:14.343 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:53:00 +0000 (0:00:00.484) 0:03:14.827 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:53:00 +0000 (0:00:00.375) 0:03:15.203 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.458) 0:03:15.661 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415155.6282952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "86851451d60b3fa2a4baab2fadb4d41be196b318", "ctime": 1658415153.6662953, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 20976658, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658415153.6652951, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "18446744072417726693", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.362) 0:03:16.024 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-4635b4bf-00eb-416a-96e6-73bd53379745', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-4635b4bf-00eb-416a-96e6-73bd53379745", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.676) 0:03:16.700 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:429 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.918) 0:03:17.618 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.077) 0:03:17.696 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": 
"/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.049) 0:03:17.745 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.037) 0:03:17.783 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a7b71ad5-d58c-4b30-bb37-33996c066d5e" }, "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "size": "4G", "type": "crypt", "uuid": "b9104d32-76dc-43d2-bef2-50a0546efd82" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 
14:53:03 +0000 (0:00:00.328) 0:03:18.111 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003566", "end": "2022-07-21 10:53:04.069018", "rc": 0, "start": "2022-07-21 10:53:04.065452" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.315) 0:03:18.426 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003505", "end": "2022-07-21 10:53:04.381411", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:53:04.377906" } STDOUT: luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.312) 0:03:18.739 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.061) 0:03:18.800 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.039) 0:03:18.839 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) 
included: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.047) 0:03:18.886 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.058) 0:03:18.945 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.456) 0:03:19.401 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.063) 0:03:19.465 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.052) 0:03:19.517 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.051) 0:03:19.569 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.040) 0:03:19.610 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.051) 0:03:19.661 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.023) 0:03:19.685 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.041) 0:03:19.727 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: 
/tmp/tmpaxjje44y/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.045) 0:03:19.772 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.024) 0:03:19.796 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.022) 0:03:19.819 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.022) 0:03:19.842 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.021) 0:03:19.863 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.021) 0:03:19.885 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.021) 0:03:19.906 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.022) 0:03:19.928 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.030) 0:03:19.959 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.041) 0:03:20.000 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 
'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.076) 0:03:20.077 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.027) 0:03:20.105 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.026) 0:03:20.132 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.027) 0:03:20.159 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.042) 0:03:20.202 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.045) 
0:03:20.247 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.023) 0:03:20.271 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.024) 0:03:20.296 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:53:05 +0000 (0:00:00.023) 0:03:20.320 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.024) 0:03:20.344 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.045) 0:03:20.390 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.054) 0:03:20.444 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.028) 0:03:20.472 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.044) 0:03:20.517 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.055) 0:03:20.572 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.053) 0:03:20.626 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.041) 0:03:20.668 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.040) 0:03:20.708 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.040) 0:03:20.748 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.035) 0:03:20.783 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.034) 0:03:20.818 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.048) 0:03:20.866 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.044) 0:03:20.911 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 
14:53:06 +0000 (0:00:00.024) 0:03:20.935 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.022) 0:03:20.958 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.022) 0:03:20.980 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.022) 0:03:21.003 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.023) 0:03:21.027 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.025) 0:03:21.052 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.024) 0:03:21.076 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.035) 0:03:21.112 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.036) 0:03:21.148 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 
'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.042) 0:03:21.191 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.154) 0:03:21.345 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.079) 0:03:21.425 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.039) 0:03:21.464 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "b9104d32-76dc-43d2-bef2-50a0546efd82" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "b9104d32-76dc-43d2-bef2-50a0546efd82" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK 
[Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.055) 0:03:21.520 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.050) 0:03:21.570 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.047) 0:03:21.618 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.055) 0:03:21.673 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.025) 0:03:21.699 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.025) 0:03:21.725 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.023) 0:03:21.749 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.037) 0:03:21.786 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.066) 0:03:21.853 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:53:07 +0000 
(0:00:00.049) 0:03:21.903 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.050) 0:03:21.953 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.038) 0:03:21.991 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.037) 0:03:22.029 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.039) 0:03:22.068 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.042) 0:03:22.111 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415179.2522953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415179.2522953, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 88432, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415179.2522953, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.334) 0:03:22.445 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.050) 0:03:22.495 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.037) 0:03:22.533 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": 
false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.037) 0:03:22.570 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.026) 0:03:22.597 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.041) 0:03:22.639 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415179.3992953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415179.3992953, "dev": 5, "device_type": 64513, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 85748, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415179.3992953, "nlink": 1, "path": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.329) 0:03:22.968 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:53:09 +0000 (0:00:00.547) 0:03:23.516 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.036095", "end": "2022-07-21 10:53:09.535685", "rc": 0, "start": "2022-07-21 10:53:09.499590" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 32 bc 7e 80 af 60 a6 cf 4b 03 cc 26 43 40 8b e4 81 7a 45 af MK salt: 30 f5 70 e8 2b e0 7a b4 57 f9 be 52 9f e5 a7 03 de ad ca ba 1d f3 5f b2 fb 89 26 c4 fa 09 5e 42 MK iterations: 22755 UUID: a7b71ad5-d58c-4b30-bb37-33996c066d5e Key Slot 0: ENABLED Iterations: 365612 Salt: dc 95 6b d7 d4 a7 bd b9 44 63 3a 78 a5 e0 a3 22 95 cf 5f 05 e7 97 3a 5d 10 9e 6e cc 9c d7 50 ff Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:53:09 +0000 (0:00:00.381) 0:03:23.897 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions 
passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:53:09 +0000 (0:00:00.083) 0:03:23.981 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:53:09 +0000 (0:00:00.095) 0:03:24.076 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:53:09 +0000 (0:00:00.088) 0:03:24.165 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:53:09 +0000 (0:00:00.083) 0:03:24.249 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:53:09 +0000 (0:00:00.051) 0:03:24.301 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.052) 0:03:24.353 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.050) 0:03:24.404 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.055) 0:03:24.459 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.049) 0:03:24.509 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.049) 0:03:24.558 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.051) 0:03:24.610 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] 
**************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.051) 0:03:24.661 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.034) 0:03:24.696 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.037) 0:03:24.733 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.037) 0:03:24.770 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.037) 0:03:24.808 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.039) 0:03:24.848 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.038) 0:03:24.886 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.036) 0:03:24.922 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:53:10 +0000 (0:00:00.036) 0:03:24.959 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.460) 0:03:25.419 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.323) 0:03:25.742 ********* ok: 
[/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.049) 0:03:25.792 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.037) 0:03:25.829 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.038) 0:03:25.868 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.038) 0:03:25.906 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.041) 0:03:25.948 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.039) 0:03:25.987 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.038) 0:03:26.025 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.037) 0:03:26.062 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.039) 0:03:26.102 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:53:11 +0000 (0:00:00.055) 0:03:26.158 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.026031", "end": "2022-07-21 10:53:12.145688", "rc": 0, "start": "2022-07-21 10:53:12.119657" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.347) 0:03:26.505 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.055) 0:03:26.560 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.052) 0:03:26.613 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.038) 0:03:26.651 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.037) 0:03:26.689 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.039) 0:03:26.728 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.038) 0:03:26.766 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.037) 0:03:26.803 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.022) 0:03:26.826 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:431 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.037) 0:03:26.864 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.054) 0:03:26.918 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:53:12 +0000 (0:00:00.086) 0:03:27.005 ********* ok: 
[/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:53:13 +0000 (0:00:00.416) 0:03:27.422 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:53:13 +0000 (0:00:00.063) 0:03:27.485 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:53:13 +0000 (0:00:00.035) 0:03:27.520 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:53:13 +0000 (0:00:00.037) 0:03:27.558 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:53:13 +0000 (0:00:00.049) 0:03:27.608 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:53:13 +0000 (0:00:00.021) 0:03:27.629 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 
providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:53:13 +0000 (0:00:00.689) 0:03:28.319 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:53:14 +0000 (0:00:00.041) 0:03:28.360 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:53:14 +0000 (0:00:00.038) 0:03:28.399 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:53:15 +0000 (0:00:01.147) 0:03:29.546 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:53:15 +0000 (0:00:00.046) 0:03:29.592 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:53:15 +0000 (0:00:00.037) 0:03:29.630 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:53:15 +0000 (0:00:00.041) 0:03:29.671 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:53:15 +0000 (0:00:00.034) 0:03:29.705 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:53:15 +0000 (0:00:00.534) 0:03:30.239 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service": { "name": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": 
"systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] 
************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:53:16 +0000 (0:00:01.006) 0:03:31.246 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:53:17 +0000 (0:00:00.104) 0:03:31.350 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d4635b4bf\x2d00eb\x2d416a\x2d96e6\x2d73bd53379745.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "name": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice systemd-journald.socket systemd-readahead-replay.service -.mount dev-nvme1n1p1.device cryptsetup-pre.target tmp.mount", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-nvme1n1p1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-4635b4bf-00eb-416a-96e6-73bd53379745", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4635b4bf-00eb-416a-96e6-73bd53379745 /dev/nvme1n1p1 /tmp/storage_testgYdj8nlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4635b4bf-00eb-416a-96e6-73bd53379745 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": 
"18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_testgYdj8nlukskey", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:53:17 +0000 (0:00:00.490) 0:03:31.840 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:53:18 +0000 (0:00:01.257) 0:03:33.098 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:53:18 +0000 (0:00:00.041) 0:03:33.139 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d4635b4bf\x2d00eb\x2d416a\x2d96e6\x2d73bd53379745.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "name": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", 
"JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4635b4bf\\x2d00eb\\x2d416a\\x2d96e6\\x2d73bd53379745.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:53:19 +0000 (0:00:00.477) 0:03:33.616 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:53:19 +0000 (0:00:00.115) 0:03:33.731 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:53:19 +0000 (0:00:00.045) 0:03:33.776 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:53:19 +0000 (0:00:00.040) 0:03:33.817 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:53:19 +0000 (0:00:00.039) 0:03:33.856 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:53:19 +0000 (0:00:00.454) 0:03:34.310 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:53:20 +0000 (0:00:00.350) 0:03:34.661 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:53:20 +0000 (0:00:00.450) 0:03:35.111 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415184.3802953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "21cb29fb3d62ca6991eeedcc24a578b33adb3242", "ctime": 1658415182.3362951, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8521540, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658415182.3352952, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "17512807", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:53:21 +0000 (0:00:00.325) 0:03:35.436 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:53:21 +0000 (0:00:00.024) 0:03:35.461 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:445 Thursday 21 July 2022 14:53:23 +0000 (0:00:01.881) 0:03:37.342 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] 
*********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:451 Thursday 21 July 2022 14:53:23 +0000 (0:00:00.043) 0:03:37.386 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:53:23 +0000 (0:00:00.043) 0:03:37.430 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:53:23 +0000 (0:00:00.056) 0:03:37.487 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:53:23 +0000 (0:00:00.038) 0:03:37.525 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a7b71ad5-d58c-4b30-bb37-33996c066d5e" }, "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "size": "4G", "type": "crypt", "uuid": "b9104d32-76dc-43d2-bef2-50a0546efd82" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:53:23 +0000 (0:00:00.322) 0:03:37.847 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.004065", "end": "2022-07-21 10:53:23.804163", "rc": 0, "start": "2022-07-21 10:53:23.800098" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:53:23 +0000 (0:00:00.317) 0:03:38.164 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003392", "end": "2022-07-21 10:53:24.115290", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:53:24.111898" } STDOUT: luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* 
task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.307) 0:03:38.472 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.064) 0:03:38.536 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.036) 0:03:38.572 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.046) 0:03:38.618 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.056) 0:03:38.675 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.364) 0:03:39.039 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.090) 0:03:39.129 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.087) 0:03:39.216 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.051) 0:03:39.268 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:53:24 +0000 (0:00:00.041) 0:03:39.309 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.053) 0:03:39.363 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.023) 0:03:39.387 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.045) 0:03:39.433 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.041) 0:03:39.475 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.024) 0:03:39.499 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.023) 0:03:39.523 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.024) 0:03:39.547 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices 
count] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.023) 0:03:39.570 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.022) 0:03:39.593 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.022) 0:03:39.616 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.023) 0:03:39.640 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.032) 0:03:39.672 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.041) 0:03:39.714 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.044) 0:03:39.758 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.033) 0:03:39.791 
********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.032) 0:03:39.824 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.032) 0:03:39.856 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.047) 0:03:39.904 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.044) 0:03:39.948 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.024) 0:03:39.972 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.023) 0:03:39.996 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.022) 0:03:40.018 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: 
/tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.025) 0:03:40.044 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.048) 0:03:40.093 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.056) 0:03:40.150 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.029) 0:03:40.179 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.045) 0:03:40.224 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:53:25 +0000 (0:00:00.064) 0:03:40.289 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.097) 0:03:40.387 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.036) 0:03:40.423 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.112) 0:03:40.536 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.039) 0:03:40.575 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.043) 
0:03:40.619 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.038) 0:03:40.657 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.050) 0:03:40.707 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.046) 0:03:40.754 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.025) 0:03:40.780 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.025) 0:03:40.805 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.025) 0:03:40.831 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.028) 0:03:40.859 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: 
/tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.026) 0:03:40.885 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.023) 0:03:40.909 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.023) 0:03:40.933 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.040) 0:03:40.973 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.034) 0:03:41.007 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.041) 0:03:41.049 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.054) 0:03:41.103 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) 
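(Aside: the set_fact above defines _storage_volume_tests as the checklist mount, fstab, fs, device, encryption, md, size, cache, and the include_tasks task then pulls in one test-verify-volume-<item>.yml file per entry; the remaining included: lines continue directly below. One common way to write such a fan-out, shown only as a sketch and not necessarily the suite's actual test-verify-volume.yml, is:)

- name: Run each per-volume verification file  # illustrative sketch of the include loop
  include_tasks: "test-verify-volume-{{ item }}.yml"
  loop: "{{ _storage_volume_tests }}"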
included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.080) 0:03:41.184 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.043) 0:03:41.227 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "b9104d32-76dc-43d2-bef2-50a0546efd82" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "b9104d32-76dc-43d2-bef2-50a0546efd82" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.054) 0:03:41.282 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:53:26 +0000 (0:00:00.050) 0:03:41.333 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.051) 0:03:41.384 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.054) 0:03:41.439 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.025) 0:03:41.464 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.023) 0:03:41.488 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.022) 0:03:41.511 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.033) 0:03:41.544 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.060) 0:03:41.604 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.058) 0:03:41.662 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.055) 0:03:41.717 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.039) 0:03:41.757 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.034) 0:03:41.792 
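(Aside: the fstab checks above amount to searching /etc/fstab for the mapped LUKS device and its mount point and asserting exactly one match of each. A standalone approximation of the same idea follows; the playbook structure, the slurp-based read and the test_device/test_mount_point variables are assumptions for illustration, with the device path copied from this run.)

- name: Approximate the fstab checks from this run  # hypothetical sketch, not the suite's task file
  hosts: all
  vars:
    test_device: /dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e
    test_mount_point: /opt/test1
  tasks:
    - name: Read /etc/fstab
      slurp:
        src: /etc/fstab
      register: fstab_raw

    - name: Expect exactly one fstab line for the device at the test mount point
      assert:
        that:
          - "(fstab_raw.content | b64decode).splitlines() | select('search', test_device) | select('search', test_mount_point) | list | length == 1"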
********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.089) 0:03:41.881 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:53:27 +0000 (0:00:00.098) 0:03:41.979 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415189.5232952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415179.2522953, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 88432, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415179.2522953, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:53:28 +0000 (0:00:00.371) 0:03:42.351 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:53:28 +0000 (0:00:00.041) 0:03:42.393 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:53:28 +0000 (0:00:00.043) 0:03:42.436 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:53:28 +0000 (0:00:00.039) 0:03:42.475 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:53:28 +0000 (0:00:00.023) 0:03:42.499 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:53:28 +0000 (0:00:00.040) 0:03:42.539 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415179.3992953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415179.3992953, "dev": 5, "device_type": 64513, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 85748, "isblk": true, "ischr": false, "isdir": false, "isfifo": 
false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415179.3992953, "nlink": 1, "path": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:53:28 +0000 (0:00:00.334) 0:03:42.873 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.542) 0:03:43.416 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.043350", "end": "2022-07-21 10:53:29.420871", "rc": 0, "start": "2022-07-21 10:53:29.377521" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 32 bc 7e 80 af 60 a6 cf 4b 03 cc 26 43 40 8b e4 81 7a 45 af MK salt: 30 f5 70 e8 2b e0 7a b4 57 f9 be 52 9f e5 a7 03 de ad ca ba 1d f3 5f b2 fb 89 26 c4 fa 09 5e 42 MK iterations: 22755 UUID: a7b71ad5-d58c-4b30-bb37-33996c066d5e Key Slot 0: ENABLED Iterations: 365612 Salt: dc 95 6b d7 d4 a7 bd b9 44 63 3a 78 a5 e0 a3 22 95 cf 5f 05 e7 97 3a 5d 10 9e 6e cc 9c d7 50 ff Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.370) 0:03:43.786 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.041) 0:03:43.827 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.053) 0:03:43.881 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.044) 0:03:43.925 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.040) 0:03:43.965 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS 
key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.054) 0:03:44.019 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.023) 0:03:44.043 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.024) 0:03:44.067 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.055) 0:03:44.123 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.050) 0:03:44.173 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.054) 0:03:44.228 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.052) 0:03:44.281 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:53:29 +0000 (0:00:00.050) 0:03:44.331 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.035) 0:03:44.367 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.034) 0:03:44.402 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.033) 0:03:44.435 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.035) 0:03:44.471 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.037) 0:03:44.508 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.035) 0:03:44.544 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.036) 0:03:44.580 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.040) 0:03:44.621 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.325) 0:03:44.946 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:53:30 +0000 (0:00:00.323) 0:03:45.269 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.100) 0:03:45.370 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.080) 0:03:45.450 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.037) 0:03:45.488 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.039) 0:03:45.527 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.038) 0:03:45.565 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.037) 0:03:45.603 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.075) 0:03:45.678 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.037) 0:03:45.716 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.038) 0:03:45.755 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.056) 0:03:45.812 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.026952", "end": "2022-07-21 10:53:31.805683", "rc": 0, "start": "2022-07-21 10:53:31.778731" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.355) 0:03:46.167 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.054) 0:03:46.221 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.056) 0:03:46.278 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:53:31 +0000 (0:00:00.040) 0:03:46.318 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.037) 0:03:46.355 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.036) 0:03:46.392 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.040) 0:03:46.433 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.037) 0:03:46.470 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.023) 0:03:46.494 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/create-test-file.yml:10 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.040) 0:03:46.535 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:457 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.360) 0:03:46.895 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.037) 0:03:46.933 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:53:32 +0000 (0:00:00.033) 0:03:46.967 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.427) 0:03:47.394 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { 
"__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.063) 0:03:47.458 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.036) 0:03:47.494 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.038) 0:03:47.532 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.050) 0:03:47.583 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.020) 0:03:47.603 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.691) 0:03:48.295 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", 
"name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:53:33 +0000 (0:00:00.039) 0:03:48.335 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:53:34 +0000 (0:00:00.041) 0:03:48.376 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:53:35 +0000 (0:00:01.208) 0:03:49.584 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:53:35 +0000 (0:00:00.047) 0:03:49.632 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:53:35 +0000 (0:00:00.035) 0:03:49.668 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:53:35 +0000 (0:00:00.041) 0:03:49.709 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:53:35 +0000 (0:00:00.033) 0:03:49.743 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:53:36 +0000 (0:00:00.601) 0:03:50.344 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", 
"source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service": { "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:53:37 +0000 (0:00:01.022) 0:03:51.367 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:53:37 +0000 (0:00:00.061) 0:03:51.429 ********* changed: 
[/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da7b71ad5\x2dd58c\x2d4b30\x2dbb37\x2d33996c066d5e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device systemd-readahead-collect.service cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", 
"Names": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:53:37 +0000 (0:00:00.491) 0:03:51.921 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e' in safe mode due to encryption removal TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:53:38 +0000 (0:00:01.125) 0:03:53.046 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e' in safe mode due to encryption removal", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:53:38 +0000 (0:00:00.039) 0:03:53.086 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da7b71ad5\x2dd58c\x2d4b30\x2dbb37\x2d33996c066d5e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", 
"status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": 
"18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:477 Thursday 21 July 2022 14:53:39 +0000 (0:00:00.477) 0:03:53.564 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:483 Thursday 21 July 2022 14:53:39 +0000 (0:00:00.036) 0:03:53.600 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:10 Thursday 21 July 2022 14:53:39 +0000 (0:00:00.047) 0:03:53.648 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415212.5272954, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415212.5272954, "dev": 64513, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658415212.5272954, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072429172834", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:15 Thursday 21 July 2022 14:53:39 +0000 (0:00:00.298) 0:03:53.946 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:494 Thursday 21 July 2022 14:53:39 +0000 (0:00:00.042) 0:03:53.988 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:53:39 +0000 (0:00:00.041) 0:03:54.030 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:53:39 +0000 (0:00:00.034) 0:03:54.065 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:53:40 +0000 (0:00:00.427) 0:03:54.492 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:53:40 +0000 (0:00:00.062) 0:03:54.555 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:53:40 +0000 (0:00:00.036) 0:03:54.592 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:53:40 +0000 (0:00:00.034) 0:03:54.627 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:53:40 +0000 (0:00:00.044) 0:03:54.671 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:53:40 +0000 (0:00:00.019) 0:03:54.690 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:53:41 +0000 (0:00:00.656) 0:03:55.347 ********* ok: 
[/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:53:41 +0000 (0:00:00.099) 0:03:55.446 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:53:41 +0000 (0:00:00.091) 0:03:55.538 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:53:42 +0000 (0:00:01.172) 0:03:56.710 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:53:42 +0000 (0:00:00.048) 0:03:56.759 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:53:42 +0000 (0:00:00.035) 0:03:56.795 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:53:42 +0000 (0:00:00.041) 0:03:56.836 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:53:42 +0000 (0:00:00.043) 0:03:56.880 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:53:43 +0000 (0:00:00.555) 0:03:57.435 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", 
"status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service": { "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": 
"systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:53:44 +0000 (0:00:01.050) 0:03:58.486 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup 
services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:53:44 +0000 (0:00:00.063) 0:03:58.550 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da7b71ad5\x2dd58c\x2d4b30\x2dbb37\x2d33996c066d5e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service systemd-readahead-replay.service systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": 
"18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:53:44 +0000 (0:00:00.498) 0:03:59.048 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/mapper/foo-test1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:54:46 +0000 (0:01:01.998) 0:05:01.047 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:54:46 +0000 (0:00:00.037) 0:05:01.084 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da7b71ad5\x2dd58c\x2d4b30\x2dbb37\x2d33996c066d5e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", 
"IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:54:47 +0000 (0:00:00.499) 0:05:01.583 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/mapper/foo-test1", "/dev/nvme2n1", "/dev/vdb", 
"/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:54:47 +0000 (0:00:00.042) 0:05:01.626 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK 
[linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:54:47 +0000 (0:00:00.041) 0:05:01.667 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:54:47 +0000 (0:00:00.040) 0:05:01.708 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:54:47 +0000 (0:00:00.342) 0:05:02.051 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:54:48 +0000 (0:00:00.470) 0:05:02.522 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/foo-test1', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:54:48 +0000 (0:00:00.359) 0:05:02.881 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:54:49 +0000 (0:00:00.456) 0:05:03.337 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415184.3802953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "21cb29fb3d62ca6991eeedcc24a578b33adb3242", "ctime": 1658415182.3362951, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8521540, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, 
"mimetype": "text/plain", "mode": "0600", "mtime": 1658415182.3352952, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "17512807", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:54:49 +0000 (0:00:00.325) 0:05:03.663 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:54:49 +0000 (0:00:00.340) 0:05:04.003 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:510 Thursday 21 July 2022 14:54:50 +0000 (0:00:00.852) 0:05:04.856 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:54:50 +0000 (0:00:00.037) 0:05:04.893 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:54:50 +0000 (0:00:00.048) 0:05:04.941 ********* skipping: 
[/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:54:50 +0000 (0:00:00.035) 0:05:04.977 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "fc446283-9309-40da-8e9a-b91f8ad243a8" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:54:50 +0000 (0:00:00.324) 0:05:05.302 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003655", "end": "2022-07-21 10:54:51.252619", "rc": 0, "start": "2022-07-21 10:54:51.248964" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:54:51 +0000 (0:00:00.310) 0:05:05.613 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003380", "end": "2022-07-21 10:54:51.560089", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:54:51.556709" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:54:51 +0000 (0:00:00.307) 0:05:05.920 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 
'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:54:51 +0000 (0:00:00.059) 0:05:05.980 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:54:51 +0000 (0:00:00.076) 0:05:06.056 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:54:51 +0000 (0:00:00.046) 0:05:06.102 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:54:51 +0000 (0:00:00.053) 0:05:06.155 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.315) 0:05:06.471 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.052) 0:05:06.524 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify PV count] 
********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.050) 0:05:06.574 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.050) 0:05:06.625 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.040) 0:05:06.665 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.053) 0:05:06.718 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.026) 0:05:06.745 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.040) 0:05:06.786 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.040) 0:05:06.827 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.024) 0:05:06.851 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.027) 0:05:06.879 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.024) 0:05:06.903 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.024) 0:05:06.928 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] 
****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.022) 0:05:06.950 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.022) 0:05:06.973 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.029) 0:05:07.002 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.036) 0:05:07.038 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.042) 0:05:07.081 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.041) 0:05:07.122 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.025) 0:05:07.147 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.025) 0:05:07.173 ********* skipping: [/cache/rhel-7.qcow2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.025) 0:05:07.199 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.043) 0:05:07.242 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.040) 0:05:07.283 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.022) 0:05:07.305 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:54:52 +0000 (0:00:00.021) 0:05:07.327 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.021) 0:05:07.349 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.022) 0:05:07.371 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.092) 0:05:07.464 ********* ok: [/cache/rhel-7.qcow2] => 
{ "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.052) 0:05:07.516 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.029) 0:05:07.546 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.043) 0:05:07.590 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.051) 0:05:07.642 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.052) 0:05:07.694 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.042) 0:05:07.737 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.038) 0:05:07.775 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.036) 0:05:07.812 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.038) 0:05:07.850 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.040) 0:05:07.890 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml for 
/cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.048) 0:05:07.938 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.047) 0:05:07.985 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.024) 0:05:08.010 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.023) 0:05:08.033 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.024) 0:05:08.058 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.025) 0:05:08.083 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.023) 0:05:08.107 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.022) 0:05:08.130 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.025) 0:05:08.155 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.033) 0:05:08.189 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.039) 0:05:08.229 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.042) 0:05:08.271 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:54:53 +0000 (0:00:00.050) 0:05:08.322 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: 
/tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.078) 0:05:08.401 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.044) 0:05:08.445 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037768, "block_size": 4096, "block_total": 1046016, "block_used": 8248, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4250697728, "size_total": 4284481536, "uuid": "fc446283-9309-40da-8e9a-b91f8ad243a8" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037768, "block_size": 4096, "block_total": 1046016, "block_used": 8248, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4250697728, "size_total": 4284481536, "uuid": "fc446283-9309-40da-8e9a-b91f8ad243a8" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.056) 0:05:08.502 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.046) 0:05:08.548 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.046) 0:05:08.595 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.050) 0:05:08.646 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.024) 0:05:08.670 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.024) 0:05:08.695 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.024) 0:05:08.719 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.120) 0:05:08.839 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.072) 0:05:08.911 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.057) 0:05:08.968 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.051) 0:05:09.020 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.034) 0:05:09.055 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.034) 0:05:09.089 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.038) 0:05:09.127 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:54:54 +0000 (0:00:00.041) 0:05:09.169 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, 
"stat": { "atime": 1658415286.6412952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415286.6412952, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 103045, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415286.6412952, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.323) 0:05:09.492 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.046) 0:05:09.539 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.039) 0:05:09.578 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.040) 0:05:09.619 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.024) 0:05:09.643 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.042) 0:05:09.685 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.024) 0:05:09.710 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.555) 0:05:10.265 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.026) 0:05:10.292 ********* 
skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:54:55 +0000 (0:00:00.025) 0:05:10.318 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.055) 0:05:10.373 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.025) 0:05:10.399 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.026) 0:05:10.425 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.024) 0:05:10.450 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.025) 0:05:10.476 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.023) 0:05:10.499 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.054) 0:05:10.554 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.049) 0:05:10.603 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.035) 0:05:10.639 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.042) 0:05:10.681 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.036) 0:05:10.717 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.035) 0:05:10.753 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.038) 0:05:10.792 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.039) 0:05:10.831 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.034) 0:05:10.866 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.034) 0:05:10.900 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.033) 0:05:10.934 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.043) 0:05:10.978 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.034) 0:05:11.012 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:54:56 +0000 (0:00:00.308) 0:05:11.321 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", 
"parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.331) 0:05:11.653 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.056) 0:05:11.709 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.039) 0:05:11.749 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.045) 0:05:11.794 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.047) 0:05:11.842 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.046) 0:05:11.889 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.037) 0:05:11.926 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.036) 0:05:11.962 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.085) 0:05:12.048 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.110) 0:05:12.158 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:54:57 +0000 (0:00:00.052) 0:05:12.211 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.020603", "end": "2022-07-21 10:54:58.187436", 
"rc": 0, "start": "2022-07-21 10:54:58.166833" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.337) 0:05:12.549 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.051) 0:05:12.601 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.055) 0:05:12.656 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.041) 0:05:12.697 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.041) 0:05:12.739 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.037) 0:05:12.777 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.036) 0:05:12.813 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.034) 0:05:12.848 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.021) 0:05:12.870 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/create-test-file.yml:10 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.034) 0:05:12.905 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: 
/tmp/tmpaxjje44y/tests/tests_luks.yml:516 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.331) 0:05:13.236 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.038) 0:05:13.275 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:54:58 +0000 (0:00:00.035) 0:05:13.310 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:54:59 +0000 (0:00:00.416) 0:05:13.727 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:54:59 +0000 (0:00:00.060) 0:05:13.787 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:54:59 +0000 (0:00:00.033) 0:05:13.821 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:54:59 +0000 (0:00:00.034) 0:05:13.856 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:54:59 +0000 (0:00:00.044) 0:05:13.900 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:54:59 +0000 (0:00:00.019) 0:05:13.920 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:55:00 +0000 (0:00:00.661) 0:05:14.582 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:55:00 +0000 (0:00:00.037) 0:05:14.619 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:55:00 +0000 (0:00:00.080) 0:05:14.699 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:55:01 +0000 (0:00:01.086) 0:05:15.786 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:55:01 +0000 (0:00:00.047) 0:05:15.833 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:55:01 +0000 (0:00:00.034) 0:05:15.867 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:55:01 +0000 (0:00:00.037) 0:05:15.905 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:55:01 +0000 (0:00:00.032) 0:05:15.938 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:55:02 +0000 (0:00:00.571) 0:05:16.509 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": 
"rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": 
"systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service": { "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:55:03 +0000 (0:00:01.041) 0:05:17.551 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service" ] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:55:03 +0000 (0:00:00.058) 0:05:17.609 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da7b71ad5\x2dd58c\x2d4b30\x2dbb37\x2d33996c066d5e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target systemd-readahead-collect.service systemd-readahead-replay.service dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; 
code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a7b71ad5-d58c-4b30-bb37-33996c066d5e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:55:03 +0000 (0:00:00.479) 0:05:18.089 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [linux-system-roles.storage : failed message] ***************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:99 Thursday 21 July 2022 14:55:04 +0000 (0:00:01.196) 0:05:19.285 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", '_ansible_no_log': False} TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:55:04 +0000 (0:00:00.040) 0:05:19.325 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da7b71ad5\x2dd58c\x2d4b30\x2dbb37\x2d33996c066d5e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "name": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da7b71ad5\\x2dd58c\\x2d4b30\\x2dbb37\\x2d33996c066d5e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": 
"10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:536 Thursday 21 July 2022 14:55:05 +0000 (0:00:00.478) 0:05:19.804 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:542 Thursday 21 July 2022 14:55:05 +0000 (0:00:00.036) 0:05:19.840 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:10 Thursday 21 July 2022 14:55:05 +0000 (0:00:00.046) 0:05:19.887 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415298.8702953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415298.8702953, "dev": 64512, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658415298.8702953, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072850305499", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmpaxjje44y/tests/verify-data-preservation.yml:15 Thursday 21 July 2022 14:55:05 +0000 (0:00:00.319) 0:05:20.206 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:553 Thursday 21 July 2022 14:55:05 +0000 (0:00:00.039) 0:05:20.246 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:55:05 +0000 (0:00:00.038) 0:05:20.284 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:55:05 +0000 (0:00:00.034) 0:05:20.318 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:55:06 +0000 (0:00:00.417) 0:05:20.736 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:55:06 +0000 (0:00:00.063) 0:05:20.799 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:55:06 +0000 (0:00:00.035) 0:05:20.835 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:55:06 +0000 (0:00:00.035) 0:05:20.870 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:55:06 +0000 (0:00:00.045) 0:05:20.915 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:55:06 +0000 (0:00:00.021) 0:05:20.937 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid 
is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:55:07 +0000 (0:00:00.695) 0:05:21.633 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:55:07 +0000 (0:00:00.080) 0:05:21.713 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:55:07 +0000 (0:00:00.082) 0:05:21.796 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:55:08 +0000 (0:00:01.103) 0:05:22.899 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:55:08 +0000 (0:00:00.114) 0:05:23.013 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:55:08 +0000 (0:00:00.034) 0:05:23.048 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:55:08 +0000 (0:00:00.041) 0:05:23.090 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:55:08 +0000 (0:00:00.035) 0:05:23.125 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:55:09 +0000 (0:00:00.549) 0:05:23.675 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { 
"name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:55:10 +0000 (0:00:01.014) 0:05:24.689 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { 
"storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:55:10 +0000 (0:00:00.056) 0:05:24.746 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:55:10 +0000 (0:00:00.023) 0:05:24.769 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:55:18 +0000 (0:00:07.706) 0:05:32.476 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:55:18 +0000 (0:00:00.039) 0:05:32.515 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:55:18 +0000 (0:00:00.021) 0:05:32.537 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:55:18 +0000 (0:00:00.042) 0:05:32.579 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:55:18 +0000 (0:00:00.038) 0:05:32.618 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:55:18 +0000 (0:00:00.036) 0:05:32.655 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/foo-test1', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:55:18 +0000 (0:00:00.355) 0:05:33.010 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:55:19 +0000 (0:00:00.481) 0:05:33.492 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", 
"fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:55:19 +0000 (0:00:00.427) 0:05:33.919 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:55:20 +0000 (0:00:00.474) 0:05:34.394 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415291.5582952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658415289.6362953, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658415289.6352952, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "213327429", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:55:20 +0000 (0:00:00.367) 0:05:34.761 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "password": "-", "state": "present" } } MSG: line added TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:55:20 +0000 (0:00:00.380) 0:05:35.142 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:569 Thursday 21 July 2022 14:55:21 +0000 (0:00:00.918) 0:05:36.061 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:55:21 +0000 (0:00:00.040) 0:05:36.101 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:55:21 +0000 (0:00:00.052) 0:05:36.154 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:55:21 +0000 (0:00:00.035) 0:05:36.190 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "6ec800ad-12e9-451c-a7d5-df95e0ab3a18" }, "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "size": "4G", "type": "crypt", "uuid": "aeffa976-1530-4e14-b5a2-ffdf914b7a5c" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", 
"label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:55:22 +0000 (0:00:00.320) 0:05:36.511 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003601", "end": "2022-07-21 10:55:22.468204", "rc": 0, "start": "2022-07-21 10:55:22.464603" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:55:22 +0000 (0:00:00.322) 0:05:36.834 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003418", "end": "2022-07-21 10:55:22.773845", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:55:22.770427" } STDOUT: luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:55:22 +0000 (0:00:00.305) 0:05:37.139 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:55:22 +0000 (0:00:00.066) 0:05:37.205 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, 
"changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:55:22 +0000 (0:00:00.035) 0:05:37.241 ********* included: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:55:22 +0000 (0:00:00.050) 0:05:37.291 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.054) 0:05:37.345 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.316) 0:05:37.662 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.050) 0:05:37.712 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.058) 0:05:37.770 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.062) 0:05:37.833 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.041) 0:05:37.875 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.092) 0:05:37.967 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.024) 0:05:37.992 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] 
*********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.039) 0:05:38.032 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.040) 0:05:38.073 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.024) 0:05:38.098 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.025) 0:05:38.123 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.026) 0:05:38.150 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.027) 0:05:38.177 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.070) 0:05:38.248 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.027) 0:05:38.275 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:55:23 +0000 (0:00:00.025) 0:05:38.301 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.038) 0:05:38.339 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.045) 0:05:38.385 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 
'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.046) 0:05:38.431 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.029) 0:05:38.461 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.030) 0:05:38.491 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.029) 0:05:38.521 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.047) 0:05:38.569 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 
'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.044) 0:05:38.613 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.024) 0:05:38.638 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.025) 0:05:38.664 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.023) 0:05:38.687 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.025) 0:05:38.713 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.050) 0:05:38.764 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.051) 0:05:38.815 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.028) 0:05:38.843 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.043) 0:05:38.886 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.055) 0:05:38.941 ********* ok: [/cache/rhel-7.qcow2] => { 
"changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.050) 0:05:38.992 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.034) 0:05:39.026 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.033) 0:05:39.060 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.033) 0:05:39.093 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.035) 0:05:39.128 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.031) 0:05:39.160 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.045) 0:05:39.205 ********* included: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: 
/tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.044) 0:05:39.250 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.025) 0:05:39.275 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.023) 0:05:39.299 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:55:24 +0000 (0:00:00.024) 0:05:39.324 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.023) 0:05:39.347 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.023) 0:05:39.370 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.023) 0:05:39.394 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.098) 0:05:39.493 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.034) 0:05:39.527 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.035) 0:05:39.563 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 
'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.042) 0:05:39.606 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.047) 0:05:39.653 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.076) 0:05:39.730 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.040) 0:05:39.770 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "aeffa976-1530-4e14-b5a2-ffdf914b7a5c" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": 
"/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "aeffa976-1530-4e14-b5a2-ffdf914b7a5c" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.055) 0:05:39.825 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.047) 0:05:39.873 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.046) 0:05:39.920 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.050) 0:05:39.970 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.022) 0:05:39.992 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.021) 0:05:40.014 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.021) 0:05:40.035 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.035) 0:05:40.070 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task 
path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.060) 0:05:40.131 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.060) 0:05:40.191 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.062) 0:05:40.253 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.038) 0:05:40.292 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:55:25 +0000 (0:00:00.038) 0:05:40.331 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.040) 0:05:40.371 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.044) 0:05:40.415 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415317.9242952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415317.9242952, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 103045, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415317.9242952, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.325) 0:05:40.740 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.038) 0:05:40.779 
********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.043) 0:05:40.822 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.040) 0:05:40.863 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.026) 0:05:40.890 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.042) 0:05:40.933 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415318.0512953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415318.0512953, "dev": 5, "device_type": 64513, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 112493, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415318.0512953, "nlink": 1, "path": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:55:26 +0000 (0:00:00.330) 0:05:41.264 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:55:27 +0000 (0:00:00.592) 0:05:41.856 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.033903", "end": "2022-07-21 10:55:27.881294", "rc": 0, "start": "2022-07-21 10:55:27.847391" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 98 4d a8 06 82 6e 01 9c 94 c5 06 3d 7c 73 bf 6c 31 81 e9 a9 MK salt: 34 82 da fd 74 e8 e7 7a 80 73 ec 33 52 da b9 23 be 53 73 45 b9 f5 9d fe f0 65 da 23 2a 39 83 64 MK iterations: 22946 UUID: 6ec800ad-12e9-451c-a7d5-df95e0ab3a18 Key Slot 0: ENABLED Iterations: 366122 Salt: 43 7d fe 3b 25 cc 46 41 b7 64 40 2b 7c 03 42 56 51 c5 24 6a cb 6c 3c f0 68 2e d9 6c e3 b4 21 00 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 
3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:55:27 +0000 (0:00:00.388) 0:05:42.245 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:55:27 +0000 (0:00:00.039) 0:05:42.284 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.051) 0:05:42.336 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.040) 0:05:42.376 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.044) 0:05:42.421 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.023) 0:05:42.445 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.023) 0:05:42.468 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.023) 0:05:42.492 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.056) 0:05:42.548 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.048) 0:05:42.597 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 
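The LUKS header dump above comes from running cryptsetup luksDump against the backing LV and shows a LUKS1 header (aes, xts-plain64) with only key slot 0 enabled. A hedged sketch of that check as a standalone task; the assertions on the dump text are an illustrative simplification.

- hosts: all
  gather_facts: false
  tasks:
    - name: Dump the LUKS header of the backing LV (same command as in the log)
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump        # illustrative variable name
      changed_when: false

    - name: Confirm a LUKS1 header with the expected UUID
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')
          - "'6ec800ad-12e9-451c-a7d5-df95e0ab3a18' in luks_dump.stdout"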
21 July 2022 14:55:28 +0000 (0:00:00.050) 0:05:42.647 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.053) 0:05:42.701 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.050) 0:05:42.751 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.034) 0:05:42.785 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.037) 0:05:42.823 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.036) 0:05:42.859 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.035) 0:05:42.895 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.035) 0:05:42.930 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.035) 0:05:42.965 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.034) 0:05:43.000 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.034) 0:05:43.034 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:55:28 +0000 (0:00:00.296) 0:05:43.331 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.318) 0:05:43.649 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.052) 0:05:43.702 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.036) 0:05:43.738 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.036) 0:05:43.775 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.036) 0:05:43.811 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.041) 0:05:43.852 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.037) 0:05:43.890 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.036) 0:05:43.927 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.042) 0:05:43.969 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.043) 0:05:44.013 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:55:29 +0000 (0:00:00.058) 0:05:44.071 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", 
"--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.022848", "end": "2022-07-21 10:55:30.083407", "rc": 0, "start": "2022-07-21 10:55:30.060559" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.372) 0:05:44.444 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.054) 0:05:44.499 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.099) 0:05:44.598 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.046) 0:05:44.644 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.047) 0:05:44.691 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.039) 0:05:44.731 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.036) 0:05:44.767 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.166) 0:05:44.934 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.025) 0:05:44.959 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:571 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.036) 0:05:44.995 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.048) 0:05:45.044 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:55:30 +0000 (0:00:00.038) 0:05:45.082 ********* ok: [/cache/rhel-7.qcow2] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:55:31 +0000 (0:00:00.415) 0:05:45.498 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:55:31 +0000 (0:00:00.066) 0:05:45.564 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:55:31 +0000 (0:00:00.035) 0:05:45.600 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:55:31 +0000 (0:00:00.042) 0:05:45.643 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:55:31 +0000 (0:00:00.051) 0:05:45.694 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 
2022 14:55:31 +0000 (0:00:00.023) 0:05:45.718 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:55:32 +0000 (0:00:00.749) 0:05:46.467 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:55:32 +0000 (0:00:00.037) 0:05:46.505 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:55:32 +0000 (0:00:00.037) 0:05:46.542 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:55:33 +0000 (0:00:01.098) 0:05:47.640 ********* included: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:55:33 +0000 (0:00:00.050) 0:05:47.691 ********* TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:55:33 +0000 (0:00:00.037) 0:05:47.728 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:55:33 +0000 (0:00:00.042) 0:05:47.771 ********* TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:55:33 +0000 (0:00:00.049) 0:05:47.820 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [linux-system-roles.storage : get service facts] ************************** task path: 
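The Clean up play re-enters linux-system-roles.storage with storage_volumes describing the foo disk volume in state absent (as echoed by the "show storage_volumes" task above), which tears down the LUKS/LVM stack created earlier. A minimal sketch of that invocation, reconstructed from the echoed variable; the play wrapper is illustrative.

- hosts: all
  vars:
    # Copied from the "show storage_volumes" output above.
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - nvme1n1
        state: absent
  roles:
    - linux-system-roles.storage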
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:55:34 +0000 (0:00:00.529) 0:05:48.350 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": 
"rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { 
"name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": 
"systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:55:35 +0000 (0:00:01.099) 0:05:49.450 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:55:35 +0000 (0:00:00.060) 0:05:49.511 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:55:35 +0000 (0:00:00.023) 0:05:49.535 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:56:07 +0000 (0:00:32.030) 0:06:21.565 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:56:07 
+0000 (0:00:00.044) 0:06:21.610 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:56:07 +0000 (0:00:00.025) 0:06:21.636 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:56:07 +0000 (0:00:00.045) 0:06:21.681 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:56:07 +0000 (0:00:00.044) 0:06:21.726 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:56:07 +0000 (0:00:00.040) 0:06:21.767 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:56:07 +0000 (0:00:00.337) 0:06:22.104 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:56:08 +0000 (0:00:00.473) 0:06:22.578 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:56:08 +0000 (0:00:00.054) 0:06:22.633 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:56:08 +0000 (0:00:00.455) 0:06:23.088 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415322.7722952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "82ad5e3e620789682bbd90017138e1af4dbb3a54", "ctime": 1658415320.7712953, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8521540, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658415320.7712953, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1339537469", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:56:09 +0000 (0:00:00.347) 0:06:23.435 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6ec800ad-12e9-451c-a7d5-df95e0ab3a18", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:56:09 +0000 (0:00:00.362) 0:06:23.797 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/tests_luks.yml:581 Thursday 21 July 2022 14:56:10 +0000 (0:00:00.841) 0:06:24.639 ********* included: /tmp/tmpaxjje44y/tests/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:56:10 +0000 (0:00:00.043) 0:06:24.682 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:56:10 +0000 (0:00:00.038) 0:06:24.721 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:56:10 +0000 (0:00:00.109) 0:06:24.830 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-33-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:56:10 +0000 (0:00:00.321) 0:06:25.152 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.004216", "end": "2022-07-21 10:56:11.115443", "rc": 0, "start": "2022-07-21 10:56:11.111227" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.333) 0:06:25.485 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003310", "end": "2022-07-21 10:56:11.480550", "failed_when_result": false, "rc": 0, "start": "2022-07-21 10:56:11.477240" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.353) 0:06:25.839 ********* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.024) 0:06:25.863 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'lvmpv', 'mount_options': 'defaults', '_device': '/dev/nvme1n1', 
'size': 10737418240, 'mount_point': None, 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'absent', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': 'UUID=vTKkPp-QTj5-ppfD-mClE-mRD0-D6Qv-qDjyQk', 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.062) 0:06:25.925 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.051) 0:06:25.977 ********* included: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.079) 0:06:26.056 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/nvme1n1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.044) 0:06:26.101 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.056) 0:06:26.157 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.034) 0:06:26.191 ********* ok: 
[/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.059) 0:06:26.251 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.044) 0:06:26.295 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:56:11 +0000 (0:00:00.027) 0:06:26.323 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.026) 0:06:26.349 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.025) 0:06:26.375 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.037) 0:06:26.413 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.060) 0:06:26.474 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.021) 0:06:26.495 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.053) 0:06:26.549 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.037) 0:06:26.586 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.037) 0:06:26.623 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.026) 0:06:26.650 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.025) 0:06:26.675 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658415367.1302953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415367.1302953, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 9660, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658415367.1302953, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.321) 0:06:26.997 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.039) 0:06:27.037 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.023) 0:06:27.060 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.037) 0:06:27.098 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.029) 0:06:27.128 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.026) 0:06:27.154 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:56:12 +0000 (0:00:00.025) 0:06:27.180 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.555) 0:06:27.735 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.026) 0:06:27.762 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.023) 0:06:27.786 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.022) 0:06:27.808 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.022) 0:06:27.831 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.025) 0:06:27.857 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.027) 0:06:27.884 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.026) 0:06:27.911 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.023) 0:06:27.935 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.103) 0:06:28.038 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.094) 0:06:28.133 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.036) 0:06:28.169 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.036) 0:06:28.206 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.036) 0:06:28.243 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.033) 0:06:28.276 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:56:13 +0000 (0:00:00.034) 0:06:28.310 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.344 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.035) 0:06:28.380 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: 
/tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.413 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.447 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.480 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.040) 0:06:28.520 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.023) 0:06:28.544 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.035) 0:06:28.580 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.613 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.036) 0:06:28.650 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.034) 0:06:28.685 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.718 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.752 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.035) 0:06:28.787 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] 
******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.033) 0:06:28.821 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.034) 0:06:28.855 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.035) 0:06:28.891 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.029) 0:06:28.920 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.025) 0:06:28.945 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.025) 0:06:28.971 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.028) 0:06:29.000 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.027) 0:06:29.027 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.024) 0:06:29.052 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.025) 0:06:29.078 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpaxjje44y/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.025) 0:06:29.104 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } 
TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpaxjje44y/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.039) 0:06:29.143 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-7.qcow2 : ok=1178 changed=63 unreachable=0 failed=9 skipped=670 rescued=9 ignored=0 Thursday 21 July 2022 14:56:14 +0000 (0:00:00.050) 0:06:29.193 ********* =============================================================================== linux-system-roles.storage : manage the pools and volumes to match the specified state -- 62.00s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state -- 32.03s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure blivet is available -------------- 8.97s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 8.20s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.85s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.71s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.56s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.43s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.42s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : Update facts ------------------------------- 1.91s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : Update facts ------------------------------- 1.88s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.55s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : make sure required packages are installed --- 1.49s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 set up internal repositories -------------------------------------------- 1.43s /cache/rhel-7_setup.yml:5 ----------------------------------------------------- linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.41s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.26s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get required 
packages ---------------------- 1.21s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.20s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : get required packages ---------------------- 1.17s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : get service facts -------------------------- 1.16s /tmp/tmpaxjje44y/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 ansible-playbook [core 2.12.6] config file = /etc/ansible/ansible.cfg configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python3.9/site-packages/ansible ansible collection location = /tmp/tmp5bkr4li_ executable location = /usr/bin/ansible-playbook python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)] jinja version = 2.11.3 libyaml = True Using /etc/ansible/ansible.cfg as config file Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: rhel-7_setup.yml ***************************************************** 1 plays in /cache/rhel-7_setup.yml PLAY [Setup repos] ************************************************************* META: ran handlers TASK [set up internal repositories] ******************************************** task path: /cache/rhel-7_setup.yml:5 Thursday 21 July 2022 17:58:09 +0000 (0:00:00.017) 0:00:00.017 ********* changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } changed: [/cache/rhel-7.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-7.qcow2 : ok=1 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 Thursday 21 July 2022 17:58:10 +0000 (0:00:01.372) 0:00:01.389 ********* =============================================================================== set up internal repositories -------------------------------------------- 1.37s /cache/rhel-7_setup.yml:5 ----------------------------------------------------- statically imported: /tmp/tmptomayb7j/tests/storage/create-test-file.yml statically imported: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml statically imported: /tmp/tmptomayb7j/tests/storage/create-test-file.yml statically imported: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml statically imported: 
/tmp/tmptomayb7j/tests/storage/create-test-file.yml statically imported: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml statically imported: /tmp/tmptomayb7j/tests/storage/create-test-file.yml statically imported: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml statically imported: /tmp/tmptomayb7j/tests/storage/create-test-file.yml statically imported: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml statically imported: /tmp/tmptomayb7j/tests/storage/create-test-file.yml statically imported: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml PLAYBOOK: tests_luks_nvme_generated.yml **************************************** 2 plays in /tmp/tmptomayb7j/tests/storage/tests_luks_nvme_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks_nvme_generated.yml:3 Thursday 21 July 2022 17:58:11 +0000 (0:00:00.050) 0:00:01.439 ********* ok: [/cache/rhel-7.qcow2] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks_nvme_generated.yml:7 Thursday 21 July 2022 17:58:12 +0000 (0:00:01.022) 0:00:02.462 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_use_interface": "nvme" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:2 Thursday 21 July 2022 17:58:12 +0000 (0:00:00.051) 0:00:02.514 ********* ok: [/cache/rhel-7.qcow2] META: ran handlers TASK [include_role : fedora.linux_system_roles.storage] ************************ task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:11 Thursday 21 July 2022 17:58:12 +0000 (0:00:00.757) 0:00:03.271 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:58:12 +0000 (0:00:00.035) 0:00:03.307 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:58:12 +0000 (0:00:00.031) 0:00:03.338 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:58:13 +0000 (0:00:00.408) 0:00:03.747 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", 
"python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:58:13 +0000 (0:00:00.078) 0:00:03.825 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:58:13 +0000 (0:00:00.028) 0:00:03.854 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:58:13 +0000 (0:00:00.031) 0:00:03.885 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:58:13 +0000 (0:00:00.053) 0:00:03.938 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:58:13 +0000 (0:00:00.018) 0:00:03.956 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "changes": { "installed": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "rc": 0, "results": [ "Loaded plugins: search-disabled-repos\nResolving Dependencies\n--> Running transaction check\n---> Package libblockdev-crypto.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libblockdev-utils(x86-64) = 2.18-5.el7 for package: libblockdev-crypto-2.18-5.el7.x86_64\n--> Processing Dependency: libvolume_key.so.1()(64bit) for package: libblockdev-crypto-2.18-5.el7.x86_64\n--> Processing Dependency: libbd_utils.so.2()(64bit) for package: libblockdev-crypto-2.18-5.el7.x86_64\n---> Package libblockdev-dm.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libdmraid.so.1(Base)(64bit) for package: libblockdev-dm-2.18-5.el7.x86_64\n--> Processing Dependency: dmraid for package: libblockdev-dm-2.18-5.el7.x86_64\n--> Processing Dependency: libdmraid.so.1()(64bit) for package: libblockdev-dm-2.18-5.el7.x86_64\n---> 
Package libblockdev-lvm.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: lvm2 for package: libblockdev-lvm-2.18-5.el7.x86_64\n--> Processing Dependency: device-mapper-persistent-data for package: libblockdev-lvm-2.18-5.el7.x86_64\n---> Package libblockdev-mdraid.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: mdadm for package: libblockdev-mdraid-2.18-5.el7.x86_64\n--> Processing Dependency: libbytesize.so.1()(64bit) for package: libblockdev-mdraid-2.18-5.el7.x86_64\n---> Package libblockdev-swap.x86_64 0:2.18-5.el7 will be installed\n---> Package python-enum34.noarch 0:1.0.4-1.el7 will be installed\n---> Package python2-blivet3.noarch 1:3.1.3-3.el7 will be installed\n--> Processing Dependency: blivet3-data = 1:3.1.3-3.el7 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-bytesize >= 0.3 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-blockdev >= 2.17 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: pyparted >= 3.9 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-hawkey for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: lsof for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Running transaction check\n---> Package blivet3-data.noarch 1:3.1.3-3.el7 will be installed\n---> Package device-mapper-persistent-data.x86_64 0:0.8.5-3.el7_9.2 will be installed\n--> Processing Dependency: libaio.so.1(LIBAIO_0.4)(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n--> Processing Dependency: libaio.so.1(LIBAIO_0.1)(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n--> Processing Dependency: libaio.so.1()(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n---> Package dmraid.x86_64 0:1.0.0.rc16-28.el7 will be installed\n--> Processing Dependency: libdevmapper-event.so.1.02(Base)(64bit) for package: dmraid-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: dmraid-events for package: dmraid-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: libdevmapper-event.so.1.02()(64bit) for package: dmraid-1.0.0.rc16-28.el7.x86_64\n---> Package libblockdev-utils.x86_64 0:2.18-5.el7 will be installed\n---> Package libbytesize.x86_64 0:1.2-1.el7 will be installed\n--> Processing Dependency: libmpfr.so.4()(64bit) for package: libbytesize-1.2-1.el7.x86_64\n---> Package lsof.x86_64 0:4.87-6.el7 will be installed\n---> Package lvm2.x86_64 7:2.02.187-6.el7_9.5 will be installed\n--> Processing Dependency: lvm2-libs = 7:2.02.187-6.el7_9.5 for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n--> Processing Dependency: liblvm2app.so.2.2(Base)(64bit) for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n--> Processing Dependency: liblvm2app.so.2.2()(64bit) for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n---> Package mdadm.x86_64 0:4.1-9.el7_9 will be installed\n--> Processing Dependency: libreport-filesystem for package: mdadm-4.1-9.el7_9.x86_64\n---> Package pyparted.x86_64 1:3.9-15.el7 will be installed\n---> Package python2-blockdev.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libblockdev(x86-64) = 2.18-5.el7 for package: python2-blockdev-2.18-5.el7.x86_64\n---> Package python2-bytesize.x86_64 0:1.2-1.el7 will be installed\n---> Package python2-hawkey.x86_64 0:0.22.5-2.el7_9 will be installed\n--> Processing Dependency: libdnf(x86-64) = 0.22.5-2.el7_9 for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: python2-libdnf 
= 0.22.5-2.el7_9 for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libdnf.so.2()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libjson-glib-1.0.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libmodulemd.so.1()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: librepo.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: librhsm.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n---> Package volume_key-libs.x86_64 0:0.3.9-9.el7 will be installed\n--> Running transaction check\n---> Package device-mapper-event-libs.x86_64 7:1.02.170-6.el7_9.5 will be installed\n---> Package dmraid-events.x86_64 0:1.0.0.rc16-28.el7 will be installed\n--> Processing Dependency: sgpio for package: dmraid-events-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: device-mapper-event for package: dmraid-events-1.0.0.rc16-28.el7.x86_64\n---> Package json-glib.x86_64 0:1.4.2-2.el7 will be installed\n---> Package libaio.x86_64 0:0.3.109-13.el7 will be installed\n---> Package libblockdev.x86_64 0:2.18-5.el7 will be installed\n---> Package libdnf.x86_64 0:0.22.5-2.el7_9 will be installed\n---> Package libmodulemd.x86_64 0:1.6.3-1.el7 will be installed\n---> Package librepo.x86_64 0:1.8.1-8.el7_9 will be installed\n---> Package libreport-filesystem.x86_64 0:2.1.11-53.el7 will be installed\n---> Package librhsm.x86_64 0:0.0.3-3.el7_9 will be installed\n---> Package libsolv.x86_64 0:0.6.34-4.el7 will be installed\n---> Package lvm2-libs.x86_64 7:2.02.187-6.el7_9.5 will be installed\n---> Package mpfr.x86_64 0:3.1.1-4.el7 will be installed\n---> Package python2-libdnf.x86_64 0:0.22.5-2.el7_9 will be installed\n--> Running transaction check\n---> Package device-mapper-event.x86_64 7:1.02.170-6.el7_9.5 will be installed\n---> Package sgpio.x86_64 0:1.2.0.10-13.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n libblockdev-crypto x86_64 2.18-5.el7 rhel 60 k\n libblockdev-dm x86_64 2.18-5.el7 rhel-optional 54 k\n libblockdev-lvm x86_64 2.18-5.el7 rhel 62 k\n libblockdev-mdraid x86_64 2.18-5.el7 rhel 57 k\n libblockdev-swap x86_64 2.18-5.el7 rhel 52 k\n python-enum34 noarch 1.0.4-1.el7 rhel 52 k\n python2-blivet3 noarch 1:3.1.3-3.el7 rhel 851 k\nInstalling for dependencies:\n blivet3-data noarch 1:3.1.3-3.el7 rhel 77 k\n device-mapper-event\n x86_64 7:1.02.170-6.el7_9.5 rhel 192 k\n device-mapper-event-libs\n x86_64 7:1.02.170-6.el7_9.5 rhel 192 k\n device-mapper-persistent-data\n x86_64 0.8.5-3.el7_9.2 rhel 423 k\n dmraid x86_64 1.0.0.rc16-28.el7 rhel 151 k\n dmraid-events x86_64 1.0.0.rc16-28.el7 rhel 21 k\n json-glib x86_64 1.4.2-2.el7 rhel 134 k\n libaio x86_64 0.3.109-13.el7 rhel 24 k\n libblockdev x86_64 2.18-5.el7 rhel 119 k\n libblockdev-utils x86_64 2.18-5.el7 
rhel 59 k\n libbytesize x86_64 1.2-1.el7 rhel 52 k\n libdnf x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 536 k\n libmodulemd x86_64 1.6.3-1.el7 rhel-7-server-extras-rpms 153 k\n librepo x86_64 1.8.1-8.el7_9 rhel 82 k\n libreport-filesystem\n x86_64 2.1.11-53.el7 rhel 41 k\n librhsm x86_64 0.0.3-3.el7_9 rhel-7-server-extras-rpms 28 k\n libsolv x86_64 0.6.34-4.el7 rhel 329 k\n lsof x86_64 4.87-6.el7 rhel 331 k\n lvm2 x86_64 7:2.02.187-6.el7_9.5 rhel 1.3 M\n lvm2-libs x86_64 7:2.02.187-6.el7_9.5 rhel 1.1 M\n mdadm x86_64 4.1-9.el7_9 rhel 440 k\n mpfr x86_64 3.1.1-4.el7 rhel 203 k\n pyparted x86_64 1:3.9-15.el7 rhel 195 k\n python2-blockdev x86_64 2.18-5.el7 rhel 61 k\n python2-bytesize x86_64 1.2-1.el7 rhel 22 k\n python2-hawkey x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 71 k\n python2-libdnf x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 611 k\n sgpio x86_64 1.2.0.10-13.el7 rhel 14 k\n volume_key-libs x86_64 0.3.9-9.el7 rhel 141 k\n\nTransaction Summary\n================================================================================\nInstall 7 Packages (+29 Dependent packages)\n\nTotal download size: 8.2 M\nInstalled size: 24 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 18 MB/s | 8.2 MB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : libblockdev-utils-2.18-5.el7.x86_64 1/36 \n Installing : 7:device-mapper-event-libs-1.02.170-6.el7_9.5.x86_64 2/36 \n Installing : json-glib-1.4.2-2.el7.x86_64 3/36 \n Installing : librhsm-0.0.3-3.el7_9.x86_64 4/36 \n Installing : libsolv-0.6.34-4.el7.x86_64 5/36 \n Installing : libaio-0.3.109-13.el7.x86_64 6/36 \n Installing : librepo-1.8.1-8.el7_9.x86_64 7/36 \n Installing : libmodulemd-1.6.3-1.el7.x86_64 8/36 \n Installing : libdnf-0.22.5-2.el7_9.x86_64 9/36 \n Installing : device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64 10/36 \n Installing : 7:device-mapper-event-1.02.170-6.el7_9.5.x86_64 11/36 \n Installing : 7:lvm2-libs-2.02.187-6.el7_9.5.x86_64 12/36 \n Installing : 7:lvm2-2.02.187-6.el7_9.5.x86_64 13/36 \n Installing : python2-libdnf-0.22.5-2.el7_9.x86_64 14/36 \n Installing : python2-hawkey-0.22.5-2.el7_9.x86_64 15/36 \n Installing : libblockdev-2.18-5.el7.x86_64 16/36 \n Installing : python2-blockdev-2.18-5.el7.x86_64 17/36 \n Installing : 1:pyparted-3.9-15.el7.x86_64 18/36 \n Installing : sgpio-1.2.0.10-13.el7.x86_64 19/36 \n Installing : dmraid-1.0.0.rc16-28.el7.x86_64 20/36 \n Installing : dmraid-events-1.0.0.rc16-28.el7.x86_64 21/36 \n Installing : volume_key-libs-0.3.9-9.el7.x86_64 22/36 \n Installing : mpfr-3.1.1-4.el7.x86_64 23/36 \n Installing : libbytesize-1.2-1.el7.x86_64 24/36 \n Installing : python2-bytesize-1.2-1.el7.x86_64 25/36 \n Installing : libreport-filesystem-2.1.11-53.el7.x86_64 26/36 \n Installing : mdadm-4.1-9.el7_9.x86_64 27/36 \n Installing : 1:blivet3-data-3.1.3-3.el7.noarch 28/36 \n Installing : lsof-4.87-6.el7.x86_64 29/36 \n Installing : 1:python2-blivet3-3.1.3-3.el7.noarch 30/36 \n Installing : libblockdev-mdraid-2.18-5.el7.x86_64 31/36 \n Installing : libblockdev-crypto-2.18-5.el7.x86_64 32/36 \n Installing : libblockdev-dm-2.18-5.el7.x86_64 33/36 \n Installing : libblockdev-lvm-2.18-5.el7.x86_64 34/36 \n Installing : libblockdev-swap-2.18-5.el7.x86_64 35/36 \n Installing : python-enum34-1.0.4-1.el7.noarch 36/36 \n Verifying : 7:device-mapper-event-1.02.170-6.el7_9.5.x86_64 1/36 \n Verifying : libblockdev-swap-2.18-5.el7.x86_64 2/36 \n Verifying : 
librhsm-0.0.3-3.el7_9.x86_64 3/36 \n Verifying : libblockdev-lvm-2.18-5.el7.x86_64 4/36 \n Verifying : lsof-4.87-6.el7.x86_64 5/36 \n Verifying : libblockdev-mdraid-2.18-5.el7.x86_64 6/36 \n Verifying : libdnf-0.22.5-2.el7_9.x86_64 7/36 \n Verifying : python-enum34-1.0.4-1.el7.noarch 8/36 \n Verifying : 1:blivet3-data-3.1.3-3.el7.noarch 9/36 \n Verifying : dmraid-events-1.0.0.rc16-28.el7.x86_64 10/36 \n Verifying : python2-blockdev-2.18-5.el7.x86_64 11/36 \n Verifying : libmodulemd-1.6.3-1.el7.x86_64 12/36 \n Verifying : librepo-1.8.1-8.el7_9.x86_64 13/36 \n Verifying : libblockdev-dm-2.18-5.el7.x86_64 14/36 \n Verifying : json-glib-1.4.2-2.el7.x86_64 15/36 \n Verifying : libaio-0.3.109-13.el7.x86_64 16/36 \n Verifying : 7:lvm2-libs-2.02.187-6.el7_9.5.x86_64 17/36 \n Verifying : python2-hawkey-0.22.5-2.el7_9.x86_64 18/36 \n Verifying : python2-bytesize-1.2-1.el7.x86_64 19/36 \n Verifying : libblockdev-2.18-5.el7.x86_64 20/36 \n Verifying : libreport-filesystem-2.1.11-53.el7.x86_64 21/36 \n Verifying : libbytesize-1.2-1.el7.x86_64 22/36 \n Verifying : 7:device-mapper-event-libs-1.02.170-6.el7_9.5.x86_64 23/36 \n Verifying : python2-libdnf-0.22.5-2.el7_9.x86_64 24/36 \n Verifying : 7:lvm2-2.02.187-6.el7_9.5.x86_64 25/36 \n Verifying : libblockdev-utils-2.18-5.el7.x86_64 26/36 \n Verifying : mpfr-3.1.1-4.el7.x86_64 27/36 \n Verifying : volume_key-libs-0.3.9-9.el7.x86_64 28/36 \n Verifying : libsolv-0.6.34-4.el7.x86_64 29/36 \n Verifying : device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64 30/36 \n Verifying : 1:python2-blivet3-3.1.3-3.el7.noarch 31/36 \n Verifying : dmraid-1.0.0.rc16-28.el7.x86_64 32/36 \n Verifying : mdadm-4.1-9.el7_9.x86_64 33/36 \n Verifying : sgpio-1.2.0.10-13.el7.x86_64 34/36 \n Verifying : libblockdev-crypto-2.18-5.el7.x86_64 35/36 \n Verifying : 1:pyparted-3.9-15.el7.x86_64 36/36 \n\nInstalled:\n libblockdev-crypto.x86_64 0:2.18-5.el7 libblockdev-dm.x86_64 0:2.18-5.el7 \n libblockdev-lvm.x86_64 0:2.18-5.el7 libblockdev-mdraid.x86_64 0:2.18-5.el7\n libblockdev-swap.x86_64 0:2.18-5.el7 python-enum34.noarch 0:1.0.4-1.el7 \n python2-blivet3.noarch 1:3.1.3-3.el7 \n\nDependency Installed:\n blivet3-data.noarch 1:3.1.3-3.el7 \n device-mapper-event.x86_64 7:1.02.170-6.el7_9.5 \n device-mapper-event-libs.x86_64 7:1.02.170-6.el7_9.5 \n device-mapper-persistent-data.x86_64 0:0.8.5-3.el7_9.2 \n dmraid.x86_64 0:1.0.0.rc16-28.el7 \n dmraid-events.x86_64 0:1.0.0.rc16-28.el7 \n json-glib.x86_64 0:1.4.2-2.el7 \n libaio.x86_64 0:0.3.109-13.el7 \n libblockdev.x86_64 0:2.18-5.el7 \n libblockdev-utils.x86_64 0:2.18-5.el7 \n libbytesize.x86_64 0:1.2-1.el7 \n libdnf.x86_64 0:0.22.5-2.el7_9 \n libmodulemd.x86_64 0:1.6.3-1.el7 \n librepo.x86_64 0:1.8.1-8.el7_9 \n libreport-filesystem.x86_64 0:2.1.11-53.el7 \n librhsm.x86_64 0:0.0.3-3.el7_9 \n libsolv.x86_64 0:0.6.34-4.el7 \n lsof.x86_64 0:4.87-6.el7 \n lvm2.x86_64 7:2.02.187-6.el7_9.5 \n lvm2-libs.x86_64 7:2.02.187-6.el7_9.5 \n mdadm.x86_64 0:4.1-9.el7_9 \n mpfr.x86_64 0:3.1.1-4.el7 \n pyparted.x86_64 1:3.9-15.el7 \n python2-blockdev.x86_64 0:2.18-5.el7 \n python2-bytesize.x86_64 0:1.2-1.el7 \n python2-hawkey.x86_64 0:0.22.5-2.el7_9 \n python2-libdnf.x86_64 0:0.22.5-2.el7_9 \n sgpio.x86_64 0:1.2.0.10-13.el7 \n volume_key-libs.x86_64 0:0.3.9-9.el7 \n\nComplete!\n" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:58:22 +0000 (0:00:08.938) 0:00:12.894 ********* ok: 
[/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:58:22 +0000 (0:00:00.036) 0:00:12.931 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:58:22 +0000 (0:00:00.033) 0:00:12.965 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:58:23 +0000 (0:00:00.648) 0:00:13.613 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:58:23 +0000 (0:00:00.045) 0:00:13.658 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:58:23 +0000 (0:00:00.031) 0:00:13.689 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:58:23 +0000 (0:00:00.035) 0:00:13.725 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:58:23 +0000 (0:00:00.032) 0:00:13.757 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:58:23 +0000 (0:00:00.520) 0:00:14.278 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { 
"name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, 
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": 
"systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:58:25 +0000 (0:00:01.141) 0:00:15.420 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.058) 0:00:15.479 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.022) 
0:00:15.501 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.457) 0:00:15.958 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.036) 0:00:15.995 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.022) 0:00:16.017 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.034) 0:00:16.052 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.034) 0:00:16.087 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.036) 0:00:16.123 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.034) 0:00:16.157 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.021) 0:00:16.178 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 17:58:25 +0000 (0:00:00.035) 0:00:16.214 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 
21 July 2022 17:58:25 +0000 (0:00:00.022) 0:00:16.236 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426300.1647246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658201031.524, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 70, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658200515.884, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071677828413", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 17:58:26 +0000 (0:00:00.444) 0:00:16.680 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 17:58:26 +0000 (0:00:00.021) 0:00:16.702 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:14 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.866) 0:00:17.568 ********* included: /tmp/tmptomayb7j/tests/storage/get_unused_disk.yml for /cache/rhel-7.qcow2 TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/get_unused_disk.yml:2 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.069) 0:00:17.638 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "disks": [ "nvme1n1" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/get_unused_disk.yml:9 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.424) 0:00:18.062 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "unused_disks": [ "nvme1n1" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmptomayb7j/tests/storage/get_unused_disk.yml:14 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.035) 0:00:18.098 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/get_unused_disk.yml:19 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.037) 0:00:18.136 ********* ok: [/cache/rhel-7.qcow2] => { "unused_disks": [ "nvme1n1" ] } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:24 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.035) 0:00:18.172 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.037) 0:00:18.209 
********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:58:27 +0000 (0:00:00.032) 0:00:18.242 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:58:28 +0000 (0:00:00.403) 0:00:18.645 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:58:28 +0000 (0:00:00.063) 0:00:18.708 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:58:28 +0000 (0:00:00.034) 0:00:18.743 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:58:28 +0000 (0:00:00.036) 0:00:18.780 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:58:28 +0000 (0:00:00.055) 0:00:18.835 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:58:28 +0000 (0:00:00.020) 0:00:18.856 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:58:29 +0000 (0:00:00.650) 0:00:19.506 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:58:29 +0000 (0:00:00.037) 0:00:19.543 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:58:29 +0000 (0:00:00.039) 0:00:19.583 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:58:30 +0000 (0:00:00.906) 0:00:20.490 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:58:30 +0000 (0:00:00.046) 0:00:20.536 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:58:30 +0000 (0:00:00.068) 0:00:20.605 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:58:30 +0000 (0:00:00.040) 0:00:20.646 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:58:30 +0000 (0:00:00.033) 0:00:20.679 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "changes": { "installed": [ "cryptsetup" ] }, "rc": 0, "results": [ "Loaded plugins: search-disabled-repos\nResolving Dependencies\n--> Running transaction check\n---> Package cryptsetup.x86_64 0:2.0.3-6.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n cryptsetup x86_64 2.0.3-6.el7 rhel 154 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package\n\nTotal download size: 154 k\nInstalled size: 354 k\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : cryptsetup-2.0.3-6.el7.x86_64 1/1 \n Verifying : cryptsetup-2.0.3-6.el7.x86_64 1/1 \n\nInstalled:\n cryptsetup.x86_64 0:2.0.3-6.el7 \n\nComplete!\n" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:58:31 +0000 (0:00:01.479) 0:00:22.158 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", 
"status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": 
"rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": 
"sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": 
"systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:58:32 +0000 (0:00:00.971) 0:00:23.130 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:58:32 +0000 (0:00:00.057) 0:00:23.187 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:58:32 +0000 (0:00:00.022) 0:00:23.210 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 17:58:33 +0000 (0:00:00.858) 0:00:24.068 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [], 'volumes': [{'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "encrypted volume 'foo' missing key/password", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:58:33 +0000 (0:00:00.051) 0:00:24.120 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:40 Thursday 21 July 2022 17:58:33 +0000 (0:00:00.031) 0:00:24.152 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:46 Thursday 21 July 2022 17:58:33 +0000 (0:00:00.038) 0:00:24.190 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:52 Thursday 21 July 2022 17:58:33 +0000 (0:00:00.047) 0:00:24.238 ********* TASK [fedora.linux_system_roles.storage : set 
platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:58:33 +0000 (0:00:00.036) 0:00:24.275 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:58:33 +0000 (0:00:00.035) 0:00:24.310 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:58:34 +0000 (0:00:00.409) 0:00:24.720 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:58:34 +0000 (0:00:00.090) 0:00:24.810 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:58:34 +0000 (0:00:00.034) 0:00:24.844 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:58:34 +0000 (0:00:00.033) 0:00:24.878 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:58:34 +0000 (0:00:00.078) 0:00:24.956 ********* 
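(Annotation, not part of the captured output: the first role invocation above failed with "encrypted volume 'foo' missing key/password" because the volume requested encryption: true without any encryption_password or encryption_key while the role ran in safe_mode; the run now in progress retries the same volume with a password, as the show storage_volumes task below reports. A minimal sketch of what such an invocation could look like in a test playbook, assuming the role is pulled in via include_role and using only the parameter names and values visible in this log, follows; the exact task layout in tests_luks.yml may differ.

  - name: Create an encrypted disk volume w/ default fs
    include_role:
      name: fedora.linux_system_roles.storage
    vars:
      storage_volumes:
        - name: foo                           # volume named in the failure message above
          type: disk
          disks: "{{ unused_disks }}"         # resolved to ['nvme1n1'] earlier in this log
          mount_point: /opt/test1
          encryption: true
          encryption_password: yabbadabbadoo  # omitting this reproduces the keyless failure above

End of annotation; the captured output resumes below.)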
skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:58:34 +0000 (0:00:00.019) 0:00:24.976 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:58:35 +0000 (0:00:00.631) 0:00:25.608 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:58:35 +0000 (0:00:00.035) 0:00:25.644 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:58:35 +0000 (0:00:00.045) 0:00:25.690 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:58:36 +0000 (0:00:00.840) 0:00:26.530 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:58:36 +0000 (0:00:00.048) 0:00:26.579 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:58:36 +0000 (0:00:00.036) 0:00:26.615 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:58:36 +0000 (0:00:00.038) 0:00:26.653 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:58:36 +0000 (0:00:00.035) 0:00:26.688 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:58:36 +0000 (0:00:00.527) 0:00:27.215 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { 
"name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": 
"rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": 
"tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:58:37 +0000 (0:00:00.982) 0:00:28.198 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:58:37 +0000 (0:00:00.058) 0:00:28.257 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:58:37 +0000 (0:00:00.023) 0:00:28.280 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", 
"thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 17:58:45 +0000 (0:00:07.345) 0:00:35.625 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:58:45 +0000 (0:00:00.037) 0:00:35.662 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 17:58:45 +0000 (0:00:00.022) 0:00:35.685 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 17:58:45 +0000 (0:00:00.037) 0:00:35.723 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test 
verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 17:58:45 +0000 (0:00:00.083) 0:00:35.806 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 17:58:45 +0000 (0:00:00.070) 0:00:35.877 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 17:58:45 +0000 (0:00:00.071) 0:00:35.949 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 17:58:46 +0000 (0:00:00.711) 0:00:36.660 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 17:58:46 +0000 (0:00:00.474) 0:00:37.135 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 17:58:47 +0000 (0:00:00.461) 0:00:37.596 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426300.1647246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658201031.524, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 70, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658200515.884, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071677828413", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 17:58:47 +0000 (0:00:00.315) 0:00:37.911 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 17:58:47 +0000 (0:00:00.453) 0:00:38.365 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:64 Thursday 21 July 2022 17:58:49 +0000 (0:00:01.851) 0:00:40.216 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 17:58:49 +0000 (0:00:00.038) 0:00:40.255 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 17:58:49 +0000 (0:00:00.039) 0:00:40.294 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 17:58:49 +0000 (0:00:00.049) 0:00:40.344 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "size": "10G", "type": "crypt", "uuid": "165c25b4-f536-4fbb-a709-cba65b9b12f4" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "a2fa758f-23c4-4089-a29f-6dc649e96a0f" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 17:58:50 +0000 (0:00:00.448) 0:00:40.792 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003292", "end": "2022-07-21 13:58:50.776456", "rc": 0, "start": "2022-07-21 13:58:50.773164" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 17:58:50 +0000 (0:00:00.436) 0:00:41.228 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", 
"/etc/crypttab" ], "delta": "0:00:00.003490", "end": "2022-07-21 13:58:51.074577", "failed_when_result": false, "rc": 0, "start": "2022-07-21 13:58:51.071087" } STDOUT: luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f /dev/nvme1n1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.301) 0:00:41.530 ********* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.021) 0:00:41.552 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', '_device': '/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f', 'raid_spare_count': None, 'name': 'foo', '_raw_kernel_device': '/dev/nvme1n1', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.056) 0:00:41.608 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.080) 0:00:41.689 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 17:58:51 +0000 
(0:00:00.128) 0:00:41.817 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.041) 0:00:41.859 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "165c25b4-f536-4fbb-a709-cba65b9b12f4" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "165c25b4-f536-4fbb-a709-cba65b9b12f4" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.057) 0:00:41.917 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.054) 0:00:41.971 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.052) 0:00:42.023 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.052) 0:00:42.076 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.026) 0:00:42.103 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.025) 0:00:42.128 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.024) 0:00:42.153 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.035) 0:00:42.188 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.058) 0:00:42.247 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.046) 0:00:42.293 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.047) 0:00:42.340 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 17:58:51 +0000 (0:00:00.039) 0:00:42.380 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.032) 0:00:42.412 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.037) 0:00:42.449 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.042) 0:00:42.492 ********* ok: [/cache/rhel-7.qcow2] => { "changed": 
false, "stat": { "atime": 1658426324.9817245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426324.9817245, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10779, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658426324.9817245, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.322) 0:00:42.814 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.037) 0:00:42.852 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.037) 0:00:42.889 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.035) 0:00:42.924 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.023) 0:00:42.948 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.034) 0:00:42.983 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426325.1187246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426325.1187246, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36556, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426325.1187246, "nlink": 1, "path": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 17:58:52 +0000 (0:00:00.321) 0:00:43.305 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 17:58:53 +0000 (0:00:00.531) 0:00:43.837 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1" ], "delta": "0:00:00.109633", "end": "2022-07-21 13:58:53.795027", "rc": 0, "start": "2022-07-21 13:58:53.685394" } STDOUT: LUKS header information for /dev/nvme1n1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 5e 3c ae 13 b7 91 4d f8 ef 5a 0e ef a7 2f 3a 6e 62 67 ec b1 MK salt: 38 50 02 19 fd 20 5d 44 f2 02 2b ea 2b 13 a4 5a 98 97 de d3 d7 ac fb 98 fd a7 74 3e b8 dc 52 e1 MK iterations: 22598 UUID: a2fa758f-23c4-4089-a29f-6dc649e96a0f Key Slot 0: ENABLED Iterations: 362076 Salt: 2b 20 a3 00 43 c9 19 79 21 95 d3 88 cd e0 79 61 8c 53 22 e8 c0 75 a0 47 52 0a 7f 40 96 e7 5f cc Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 17:58:53 +0000 (0:00:00.414) 0:00:44.251 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 17:58:53 +0000 (0:00:00.042) 0:00:44.293 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 17:58:53 +0000 (0:00:00.091) 0:00:44.384 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.076) 0:00:44.461 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.039) 0:00:44.500 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.024) 0:00:44.524 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 
17:58:54 +0000 (0:00:00.021) 0:00:44.546 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.021) 0:00:44.567 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f /dev/nvme1n1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.090) 0:00:44.657 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.045) 0:00:44.702 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.045) 0:00:44.748 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.045) 0:00:44.794 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.045) 0:00:44.839 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.030) 0:00:44.869 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.034) 0:00:44.904 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.032) 0:00:44.936 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.031) 0:00:44.968 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.031) 0:00:45.000 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.038) 0:00:45.039 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.035) 0:00:45.075 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.037) 0:00:45.112 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.022) 0:00:45.135 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.036) 0:00:45.172 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.034) 0:00:45.206 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.033) 0:00:45.239 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.040) 0:00:45.279 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.042) 0:00:45.322 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 17:58:54 +0000 (0:00:00.037) 0:00:45.359 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK 
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.035) 0:00:45.395 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.037) 0:00:45.433 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.036) 0:00:45.470 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.032) 0:00:45.503 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.021) 0:00:45.524 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.021) 0:00:45.546 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.022) 0:00:45.569 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.026) 0:00:45.595 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.030) 0:00:45.626 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.024) 0:00:45.650 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 
17:58:55 +0000 (0:00:00.027) 0:00:45.678 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.023) 0:00:45.701 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.036) 0:00:45.737 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/create-test-file.yml:10 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.036) 0:00:45.774 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:70 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.473) 0:00:46.247 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.038) 0:00:46.286 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:58:55 +0000 (0:00:00.073) 0:00:46.359 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:58:56 +0000 (0:00:00.400) 0:00:46.760 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } 
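The role run that follows is the "Remove the encryption layer" step: the storage role is asked to keep the same disk volume but with encryption turned off. A minimal sketch of that invocation is given below, assuming a plain include_role wrapper; the hosts/become framing and task layout are assumptions rather than a copy of tests_luks.yml, while the volume parameters are taken from the "show storage_volumes" output further down in this log. As the safe-mode failure later in the run shows, the role's blivet module is invoked with safe_mode: true, so removing the existing LUKS formatting is rejected unless the playbook explicitly opts in (per the role's documentation, by setting storage_safe_mode: false); the test deliberately leaves that out so it can assert the failure.

---
# Hypothetical reproduction of the "Remove the encryption layer" step.
# Volume values are copied from the "show storage_volumes" task output in this
# log; everything else (hosts, become, task name) is an illustrative assumption.
- hosts: all
  become: true
  tasks:
    - name: Remove the encryption layer (expected to fail in safe mode)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # storage_safe_mode defaults to true, so the role refuses to strip the
        # existing LUKS formatting; a real playbook would set it to false here.
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - nvme1n1
            mount_point: /opt/test1
            encryption: false              # request removal of the LUKS layer
            encryption_password: yabbadabbadoo

After the role fails, the test asserts that the failure occurred ("Check that we failed in the role"), checks the error text, and then stats /opt/test1/quux to confirm the data created earlier survived, which is the behaviour safe mode is meant to guarantee: destructive re-formatting only happens when the caller opts in explicitly.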
TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:58:56 +0000 (0:00:00.063) 0:00:46.823 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:58:56 +0000 (0:00:00.035) 0:00:46.859 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:58:56 +0000 (0:00:00.033) 0:00:46.892 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:58:56 +0000 (0:00:00.054) 0:00:46.947 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:58:56 +0000 (0:00:00.021) 0:00:46.968 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:58:57 +0000 (0:00:00.714) 0:00:47.683 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:58:57 +0000 (0:00:00.038) 0:00:47.721 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : get required packages] *************** 
task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:58:57 +0000 (0:00:00.039) 0:00:47.761 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:58:58 +0000 (0:00:00.955) 0:00:48.716 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:58:58 +0000 (0:00:00.045) 0:00:48.762 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:58:58 +0000 (0:00:00.035) 0:00:48.797 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:58:58 +0000 (0:00:00.038) 0:00:48.836 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:58:58 +0000 (0:00:00.038) 0:00:48.874 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:58:59 +0000 (0:00:00.542) 0:00:49.416 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": 
"mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": 
"selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { 
"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:59:00 +0000 (0:00:01.014) 0:00:50.431 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:59:00 +0000 (0:00:00.090) 0:00:50.522 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:59:00 +0000 (0:00:00.021) 0:00:50.543 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 17:59:01 +0000 (0:00:01.006) 0:00:51.550 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [], 'volumes': [{'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 10735321088, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f' in safe mode due to encryption removal", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:59:01 +0000 (0:00:00.041) 0:00:51.591 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:85 Thursday 21 July 2022 17:59:01 +0000 
(0:00:00.020) 0:00:51.612 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:91 Thursday 21 July 2022 17:59:01 +0000 (0:00:00.036) 0:00:51.649 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:10 Thursday 21 July 2022 17:59:01 +0000 (0:00:00.049) 0:00:51.698 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426335.7897246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426335.7897246, "dev": 64512, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658426335.7897246, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073496964868", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:15 Thursday 21 July 2022 17:59:01 +0000 (0:00:00.339) 0:00:52.037 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:98 Thursday 21 July 2022 17:59:01 +0000 (0:00:00.050) 0:00:52.088 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:59:01 +0000 (0:00:00.039) 0:00:52.127 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:59:01 +0000 (0:00:00.033) 0:00:52.160 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:59:02 +0000 (0:00:00.419) 0:00:52.580 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", 
"python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:59:02 +0000 (0:00:00.060) 0:00:52.641 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:59:02 +0000 (0:00:00.037) 0:00:52.678 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:59:02 +0000 (0:00:00.035) 0:00:52.714 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:59:02 +0000 (0:00:00.054) 0:00:52.769 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:59:02 +0000 (0:00:00.019) 0:00:52.788 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:59:03 +0000 (0:00:00.745) 0:00:53.534 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show 
storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:59:03 +0000 (0:00:00.096) 0:00:53.631 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:59:03 +0000 (0:00:00.043) 0:00:53.674 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:59:04 +0000 (0:00:00.953) 0:00:54.627 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:59:04 +0000 (0:00:00.047) 0:00:54.675 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:59:04 +0000 (0:00:00.036) 0:00:54.712 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:59:04 +0000 (0:00:00.041) 0:00:54.753 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:59:04 +0000 (0:00:00.034) 0:00:54.787 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:59:04 +0000 (0:00:00.546) 0:00:55.334 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { 
"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": 
"ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": 
{ "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { 
"name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:59:05 +0000 (0:00:01.005) 0:00:56.340 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:59:06 +0000 (0:00:00.056) 0:00:56.396 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:59:06 +0000 (0:00:00.020) 0:00:56.417 ********* changed: [/cache/rhel-7.qcow2] => 
{ "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": "UUID=3b1fde90-821a-435e-a750-30060b766101", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 17:59:07 +0000 (0:00:01.337) 0:00:57.754 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:59:07 +0000 (0:00:00.047) 0:00:57.802 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 17:59:07 +0000 (0:00:00.023) 0:00:57.825 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": 
"luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": "UUID=3b1fde90-821a-435e-a750-30060b766101", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 17:59:07 +0000 (0:00:00.053) 0:00:57.879 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 17:59:07 +0000 (0:00:00.039) 0:00:57.918 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": "UUID=3b1fde90-821a-435e-a750-30060b766101", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 17:59:07 +0000 (0:00:00.039) 0:00:57.958 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 17:59:07 +0000 (0:00:00.347) 0:00:58.305 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 17:59:08 +0000 (0:00:00.472) 0:00:58.778 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=3b1fde90-821a-435e-a750-30060b766101', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 17:59:08 +0000 (0:00:00.362) 0:00:59.140 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 17:59:09 +0000 (0:00:00.469) 0:00:59.610 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426331.0737245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6b3a93840e1d4394903c2794d351df0592f7820c", "ctime": 1658426327.9047246, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658426327.9037247, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 57, "uid": 0, "version": "1615048679", "wgrp": false, 
"woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 17:59:09 +0000 (0:00:00.356) 0:00:59.966 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 17:59:09 +0000 (0:00:00.350) 0:01:00.317 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:111 Thursday 21 July 2022 17:59:10 +0000 (0:00:00.901) 0:01:01.218 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 17:59:10 +0000 (0:00:00.039) 0:01:01.257 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 17:59:10 +0000 (0:00:00.037) 0:01:01.295 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_kernel_device": "/dev/nvme1n1", "_mount_id": "UUID=3b1fde90-821a-435e-a750-30060b766101", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10735321088, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 17:59:10 +0000 (0:00:00.053) 0:01:01.348 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "xfs", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "3b1fde90-821a-435e-a750-30060b766101" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 17:59:12 +0000 (0:00:01.315) 0:01:02.663 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003439", "end": "2022-07-21 13:59:12.519440", "rc": 0, "start": "2022-07-21 13:59:12.516001" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 UUID=3b1fde90-821a-435e-a750-30060b766101 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 17:59:12 +0000 (0:00:00.311) 0:01:02.975 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003879", "end": "2022-07-21 13:59:12.841417", "failed_when_result": false, "rc": 0, "start": "2022-07-21 13:59:12.837538" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 17:59:12 +0000 (0:00:00.322) 0:01:03.298 ********* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 17:59:12 +0000 (0:00:00.022) 0:01:03.321 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': 
'/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', '_device': '/dev/nvme1n1', 'size': 10735321088, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1', 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=3b1fde90-821a-435e-a750-30060b766101', 'raid_spare_count': None, 'name': 'foo', '_raw_kernel_device': '/dev/nvme1n1', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 17:59:12 +0000 (0:00:00.068) 0:01:03.389 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.053) 0:01:03.443 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.078) 0:01:03.521 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/nvme1n1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.039) 0:01:03.561 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2610632, "block_size": 4096, "block_total": 2618880, "block_used": 8248, "device": "/dev/nvme1n1", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10693148672, "size_total": 10726932480, "uuid": 
"3b1fde90-821a-435e-a750-30060b766101" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610632, "block_size": 4096, "block_total": 2618880, "block_used": 8248, "device": "/dev/nvme1n1", "fstype": "xfs", "inode_available": 5242877, "inode_total": 5242880, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10693148672, "size_total": 10726932480, "uuid": "3b1fde90-821a-435e-a750-30060b766101" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.053) 0:01:03.614 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.050) 0:01:03.664 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.050) 0:01:03.715 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.056) 0:01:03.772 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.025) 0:01:03.798 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.025) 0:01:03.823 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.022) 0:01:03.845 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.076) 0:01:03.921 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=3b1fde90-821a-435e-a750-30060b766101 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.063) 0:01:03.985 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.084) 0:01:04.069 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.085) 0:01:04.155 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.037) 0:01:04.193 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.081) 0:01:04.274 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 17:59:13 +0000 (0:00:00.076) 0:01:04.351 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.079) 0:01:04.430 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426347.2647245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426347.2647245, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10779, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658426347.2647245, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.318) 
0:01:04.748 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.040) 0:01:04.789 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.039) 0:01:04.829 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.036) 0:01:04.865 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.022) 0:01:04.888 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.036) 0:01:04.925 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 17:59:14 +0000 (0:00:00.023) 0:01:04.948 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.552) 0:01:05.501 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.024) 0:01:05.525 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.024) 0:01:05.550 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.053) 0:01:05.603 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.022) 0:01:05.626 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.023) 0:01:05.649 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.022) 0:01:05.672 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.024) 0:01:05.696 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.022) 0:01:05.718 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.051) 0:01:05.770 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.049) 0:01:05.819 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.037) 0:01:05.856 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.037) 0:01:05.894 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.039) 0:01:05.933 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.031) 0:01:05.964 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.033) 0:01:05.998 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.035) 0:01:06.033 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.037) 0:01:06.071 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.035) 0:01:06.107 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.038) 0:01:06.146 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.036) 0:01:06.183 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.039) 0:01:06.222 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.024) 0:01:06.246 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.034) 0:01:06.281 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.037) 0:01:06.318 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } 
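For reference, the state these verification tasks are checking (the LUKS layer removed from /dev/nvme1n1, the disk reformatted as xfs and mounted at /opt/test1) corresponds to a storage-role declaration along the lines of the minimal sketch below. The storage_volumes variable name and the play framing are assumed from the role's public interface; the per-volume attributes are copied from the task output above, and luks_password is a hypothetical placeholder rather than the test value.

# Illustrative sketch only; not part of the recorded run.
- hosts: all
  roles:
    - fedora.linux_system_roles.storage
  vars:
    storage_volumes:            # assumed public input variable of the role
      - name: foo               # volume name seen in the output above
        type: disk
        disks: [nvme1n1]
        fs_type: xfs
        mount_point: /opt/test1
        encryption: false       # with the password supplied, the role tears down the existing LUKS layer
        encryption_password: "{{ luks_password }}"   # hypothetical placeholder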
TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 17:59:15 +0000 (0:00:00.045) 0:01:06.364 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.045) 0:01:06.410 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.038) 0:01:06.448 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.036) 0:01:06.485 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.047) 0:01:06.532 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.039) 0:01:06.572 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.085) 0:01:06.657 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.073) 0:01:06.731 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.025) 0:01:06.756 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.025) 0:01:06.782 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.027) 0:01:06.810 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.059) 0:01:06.869 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.024) 0:01:06.893 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.025) 0:01:06.919 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.024) 0:01:06.943 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.023) 0:01:06.966 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.033) 0:01:06.999 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/create-test-file.yml:10 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.035) 0:01:07.034 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:117 Thursday 21 July 2022 17:59:16 +0000 (0:00:00.347) 0:01:07.382 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.036) 0:01:07.418 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.034) 0:01:07.453 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.414) 0:01:07.867 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.060) 0:01:07.928 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.033) 0:01:07.962 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.032) 0:01:07.995 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.053) 0:01:08.048 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:59:17 +0000 (0:00:00.020) 0:01:08.068 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 
providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:59:18 +0000 (0:00:00.711) 0:01:08.780 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:59:18 +0000 (0:00:00.041) 0:01:08.822 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:59:18 +0000 (0:00:00.038) 0:01:08.860 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:59:19 +0000 (0:00:00.855) 0:01:09.716 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:59:19 +0000 (0:00:00.045) 0:01:09.762 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:59:19 +0000 (0:00:00.035) 0:01:09.798 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:59:19 +0000 (0:00:00.041) 0:01:09.839 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:59:19 +0000 (0:00:00.034) 0:01:09.873 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 
21 July 2022 17:59:20 +0000 (0:00:00.564) 0:01:10.438 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": 
"nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": 
"rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service": { "name": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { 
"name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:59:21 +0000 (0:00:01.079) 0:01:11.517 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:59:21 +0000 (0:00:00.059) 0:01:11.577 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da2fa758f\x2d23c4\x2d4089\x2da29f\x2d6dc649e96a0f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "name": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-readahead-collect.service dev-nvme1n1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-nvme1n1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f /dev/nvme1n1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a2fa758f-23c4-4089-a29f-6dc649e96a0f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", 
"InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:59:21 +0000 (0:00:00.475) 0:01:12.052 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'nvme1n1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 17:59:22 +0000 (0:00:00.866) 0:01:12.919 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [], 'volumes': [{'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'nvme1n1' in safe mode due to adding encryption", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:59:22 +0000 (0:00:00.038) 0:01:12.958 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2da2fa758f\x2d23c4\x2d4089\x2da29f\x2d6dc649e96a0f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "name": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da2fa758f\\x2d23c4\\x2d4089\\x2da29f\\x2d6dc649e96a0f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:132 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.442) 0:01:13.401 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify 
the output of the safe_mode test] ********************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:138 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.036) 0:01:13.438 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:10 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.046) 0:01:13.484 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426356.9247246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426356.9247246, "dev": 66305, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658426356.9247246, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072722244812", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:15 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.311) 0:01:13.796 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:145 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.044) 0:01:13.841 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.040) 0:01:13.881 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.033) 0:01:13.915 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:59:23 +0000 (0:00:00.422) 0:01:14.337 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, 
"ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.063) 0:01:14.401 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.033) 0:01:14.434 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.033) 0:01:14.468 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.055) 0:01:14.523 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.026) 0:01:14.550 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.681) 0:01:15.231 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.080) 0:01:15.311 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:59:24 +0000 (0:00:00.078) 0:01:15.389 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:59:25 +0000 (0:00:00.878) 0:01:16.268 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:59:25 +0000 (0:00:00.079) 0:01:16.347 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:59:25 +0000 (0:00:00.033) 0:01:16.381 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:59:26 +0000 (0:00:00.037) 0:01:16.418 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:59:26 +0000 (0:00:00.034) 0:01:16.452 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:59:26 +0000 (0:00:00.522) 0:01:16.975 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": 
"dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, 
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": 
"systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:59:27 +0000 (0:00:00.977) 0:01:17.953 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:59:27 +0000 (0:00:00.055) 0:01:18.009 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:59:27 +0000 (0:00:00.021) 
0:01:18.030 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 17:59:35 +0000 (0:00:07.385) 0:01:25.415 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:59:35 +0000 (0:00:00.035) 0:01:25.451 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 17:59:35 +0000 (0:00:00.020) 0:01:25.472 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 17:59:35 +0000 (0:00:00.039) 0:01:25.511 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 17:59:35 +0000 (0:00:00.034) 0:01:25.545 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 
10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 17:59:35 +0000 (0:00:00.036) 0:01:25.582 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=3b1fde90-821a-435e-a750-30060b766101', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=3b1fde90-821a-435e-a750-30060b766101" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 17:59:35 +0000 (0:00:00.333) 0:01:25.915 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 17:59:35 +0000 (0:00:00.479) 0:01:26.394 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 17:59:36 +0000 (0:00:00.362) 0:01:26.756 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 17:59:36 +0000 (0:00:00.475) 0:01:27.232 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426352.8407245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426349.8417246, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 16561, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, 
"mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658426349.8397245, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072603671010", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 17:59:37 +0000 (0:00:00.367) 0:01:27.599 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 17:59:37 +0000 (0:00:00.340) 0:01:27.940 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:158 Thursday 21 July 2022 17:59:38 +0000 (0:00:00.938) 0:01:28.878 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 17:59:38 +0000 (0:00:00.038) 0:01:28.917 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 17:59:38 +0000 (0:00:00.037) 0:01:28.955 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "_raw_device": "/dev/nvme1n1", "_raw_kernel_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 17:59:38 +0000 (0:00:00.051) 0:01:29.006 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "size": "10G", "type": "crypt", "uuid": "feb8ff09-5f84-4890-8a88-238aded51447" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "b33e637e-ac54-4d1f-a39e-e9eff8d644f1" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 17:59:39 +0000 (0:00:01.322) 0:01:30.329 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003426", "end": "2022-07-21 13:59:40.180810", "rc": 0, "start": "2022-07-21 13:59:40.177384" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.310) 0:01:30.639 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003378", "end": "2022-07-21 13:59:40.489161", "failed_when_result": false, "rc": 0, "start": "2022-07-21 13:59:40.485783" } STDOUT: luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1 /dev/nvme1n1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.308) 0:01:30.947 ********* TASK [Verify the 
volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.022) 0:01:30.970 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', '_device': '/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1', 'size': 10737418240, 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1', 'raid_spare_count': None, 'name': 'foo', '_raw_kernel_device': '/dev/nvme1n1', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.058) 0:01:31.029 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.049) 0:01:31.079 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.074) 0:01:31.153 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.070) 0:01:31.223 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { 
"storage_test_mount_device_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "feb8ff09-5f84-4890-8a88-238aded51447" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610120, "block_size": 4096, "block_total": 2618368, "block_used": 8248, "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fstype": "xfs", "inode_available": 5241853, "inode_total": 5241856, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10691051520, "size_total": 10724835328, "uuid": "feb8ff09-5f84-4890-8a88-238aded51447" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.058) 0:01:31.282 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 17:59:40 +0000 (0:00:00.047) 0:01:31.330 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.083) 0:01:31.414 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.054) 0:01:31.468 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.022) 0:01:31.491 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.021) 0:01:31.512 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.024) 0:01:31.537 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] 
*********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.070) 0:01:31.607 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.097) 0:01:31.704 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.083) 0:01:31.788 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.098) 0:01:31.886 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.038) 0:01:31.925 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.034) 0:01:31.959 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.041) 0:01:32.000 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.039) 0:01:32.040 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426374.7807245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426374.7807245, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10779, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 
1658426374.7807245, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 17:59:41 +0000 (0:00:00.315) 0:01:32.355 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 17:59:42 +0000 (0:00:00.039) 0:01:32.395 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 17:59:42 +0000 (0:00:00.039) 0:01:32.434 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 17:59:42 +0000 (0:00:00.038) 0:01:32.473 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 17:59:42 +0000 (0:00:00.022) 0:01:32.495 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 17:59:42 +0000 (0:00:00.036) 0:01:32.532 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426374.9077246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426374.9077246, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 53039, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426374.9077246, "nlink": 1, "path": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 17:59:42 +0000 (0:00:00.316) 0:01:32.849 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.567) 0:01:33.416 ********* 
ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1" ], "delta": "0:00:00.034497", "end": "2022-07-21 13:59:43.323191", "rc": 0, "start": "2022-07-21 13:59:43.288694" } STDOUT: LUKS header information for /dev/nvme1n1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 1c cb 75 7b 9c d3 53 2b 93 f5 e9 48 7e 5a eb a1 02 16 cc e5 MK salt: f7 5a 91 04 12 05 65 ad 0a e2 88 42 73 b1 6d 02 2e 92 f0 2c dc 10 a9 cf 60 18 26 33 9b ff 2e f6 MK iterations: 22882 UUID: b33e637e-ac54-4d1f-a39e-e9eff8d644f1 Key Slot 0: ENABLED Iterations: 363582 Salt: 0d 34 00 4e 56 58 f6 99 85 e1 a2 09 57 c4 ac 42 3e b3 58 77 d0 45 69 db ba 02 eb 99 de 17 7a ef Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.366) 0:01:33.783 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.040) 0:01:33.823 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.049) 0:01:33.872 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.037) 0:01:33.910 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.038) 0:01:33.948 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.025) 0:01:33.974 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.025) 0:01:33.999 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.023) 0:01:34.023 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1 
/dev/nvme1n1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.049) 0:01:34.072 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.087) 0:01:34.160 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.051) 0:01:34.211 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.047) 0:01:34.259 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.085) 0:01:34.344 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 17:59:43 +0000 (0:00:00.039) 0:01:34.384 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.041) 0:01:34.425 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.051) 0:01:34.476 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.039) 0:01:34.516 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.036) 0:01:34.553 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.071) 0:01:34.625 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.035) 0:01:34.660 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.038) 0:01:34.699 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.026) 0:01:34.725 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.037) 0:01:34.762 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.037) 0:01:34.800 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.040) 0:01:34.841 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.041) 0:01:34.882 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.044) 0:01:34.927 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.040) 0:01:34.968 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.039) 0:01:35.007 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 17:59:44 
+0000 (0:00:00.043) 0:01:35.051 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.037) 0:01:35.089 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.038) 0:01:35.127 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.023) 0:01:35.151 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.026) 0:01:35.177 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.025) 0:01:35.203 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.025) 0:01:35.228 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.024) 0:01:35.253 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.025) 0:01:35.278 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.024) 0:01:35.303 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.024) 0:01:35.328 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable 
namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 17:59:44 +0000 (0:00:00.044) 0:01:35.372 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:166 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.037) 0:01:35.410 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.037) 0:01:35.447 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.036) 0:01:35.484 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.410) 0:01:35.894 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.061) 0:01:35.956 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.036) 0:01:35.992 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 
2022 17:59:45 +0000 (0:00:00.033) 0:01:36.026 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.058) 0:01:36.085 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:59:45 +0000 (0:00:00.022) 0:01:36.107 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:59:46 +0000 (0:00:00.767) 0:01:36.875 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:59:46 +0000 (0:00:00.040) 0:01:36.916 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:59:46 +0000 (0:00:00.038) 0:01:36.954 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:59:47 +0000 (0:00:00.930) 0:01:37.885 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: 
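The storage_pools value shown a few tasks above drives this role run: the test asks for an encrypted 4g partition volume on nvme1n1 but deliberately omits any key material. A sketch of an equivalent role invocation, reconstructed from that variable dump (the include_role form and the task name are assumptions based on similar tasks earlier in this log):

- name: Create an encrypted partition volume w/ default fs (keyless sketch)
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_pools:
      - name: foo
        type: partition
        disks:
          - nvme1n1
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            # no encryption_password or encryption_key is given, which is what the
            # role later rejects with "encrypted volume 'test1' missing key/password"
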
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:59:47 +0000 (0:00:00.048) 0:01:37.933 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:59:47 +0000 (0:00:00.036) 0:01:37.970 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:59:47 +0000 (0:00:00.038) 0:01:38.009 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:59:47 +0000 (0:00:00.033) 0:01:38.042 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:59:48 +0000 (0:00:00.519) 0:01:38.561 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": 
"active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": 
"rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": 
"sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": 
"systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:59:49 +0000 (0:00:01.009) 0:01:39.570 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:59:49 +0000 (0:00:00.055) 0:01:39.625 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:59:49 +0000 (0:00:00.020) 0:01:39.646 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.991) 0:01:40.638 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': False, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "encrypted volume 'test1' missing key/password", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.039) 0:01:40.677 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:187 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.020) 0:01:40.698 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:193 Thursday 
21 July 2022 17:59:50 +0000 (0:00:00.036) 0:01:40.734 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:198 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.046) 0:01:40.781 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.038) 0:01:40.819 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.034) 0:01:40.854 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.444) 0:01:41.298 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 17:59:50 +0000 (0:00:00.060) 0:01:41.359 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 17:59:51 +0000 (0:00:00.073) 0:01:41.432 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 17:59:51 +0000 (0:00:00.073) 0:01:41.505 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 17:59:51 +0000 (0:00:00.056) 0:01:41.562 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 17:59:51 +0000 (0:00:00.021) 0:01:41.584 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 17:59:51 +0000 (0:00:00.699) 0:01:42.283 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 17:59:51 +0000 (0:00:00.038) 0:01:42.321 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 17:59:51 +0000 (0:00:00.039) 0:01:42.360 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 17:59:52 +0000 (0:00:00.901) 0:01:43.262 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 17:59:52 +0000 (0:00:00.048) 0:01:43.310 ********* TASK [fedora.linux_system_roles.storage : make sure COPR 
support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 17:59:52 +0000 (0:00:00.032) 0:01:43.343 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 17:59:52 +0000 (0:00:00.039) 0:01:43.383 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 17:59:53 +0000 (0:00:00.036) 0:01:43.420 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 17:59:53 +0000 (0:00:00.553) 0:01:43.974 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" 
}, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { 
"name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { 
"name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 17:59:54 +0000 (0:00:00.969) 0:01:44.943 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 17:59:54 +0000 (0:00:00.059) 0:01:45.003 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 17:59:54 +0000 (0:00:00.021) 0:01:45.025 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "password": "-", "state": "absent" }, { "backing_device": "/dev/nvme1n1p1", "name": "luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:00:02 +0000 (0:00:07.828) 0:01:52.854 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:00:02 +0000 (0:00:00.036) 0:01:52.890 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:00:02 +0000 (0:00:00.020) 0:01:52.911 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1", "name": "luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "password": "-", "state": "absent" }, { "backing_device": 
"/dev/nvme1n1p1", "name": "luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:00:02 +0000 (0:00:00.038) 0:01:52.949 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": 
"xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:00:02 +0000 (0:00:00.076) 0:01:53.026 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:00:02 +0000 (0:00:00.036) 0:01:53.063 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:00:03 +0000 (0:00:00.381) 0:01:53.445 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:00:03 +0000 (0:00:00.529) 0:01:53.974 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:00:03 +0000 (0:00:00.359) 0:01:54.333 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the 
/etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:00:04 +0000 (0:00:00.455) 0:01:54.789 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426380.4877245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7643a30de37d98c2d48f986201bdf0bc3d70def0", "ctime": 1658426377.4747245, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658426377.4737246, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 57, "uid": 0, "version": "542786558", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:00:04 +0000 (0:00:00.315) 0:01:55.104 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1', 'backing_device': '/dev/nvme1n1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1", "name": "luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:00:05 +0000 (0:00:00.619) 0:01:55.723 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:215 Thursday 21 July 2022 18:00:06 +0000 (0:00:00.852) 0:01:56.576 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:00:06 +0000 (0:00:00.039) 0:01:56.615 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:00:06 +0000 (0:00:00.052) 0:01:56.667 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:00:06 +0000 (0:00:00.034) 0:01:56.702 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "size": "10G", "type": "crypt", "uuid": "0374c12d-dcfb-470b-9981-3d186641db7a" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1p1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1p1", "size": "10G", "type": "partition", "uuid": "01395b91-59c1-4cd8-bd79-f70ece0d6b5a" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:00:06 +0000 (0:00:00.317) 0:01:57.019 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003745", "end": "2022-07-21 14:00:06.852470", "rc": 0, "start": "2022-07-21 14:00:06.848725" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:00:06 +0000 (0:00:00.291) 0:01:57.311 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003465", "end": "2022-07-21 14:00:07.160678", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:00:07.157213" } STDOUT: luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a /dev/nvme1n1p1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.307) 0:01:57.618 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.060) 0:01:57.678 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.039) 0:01:57.718 ********* 
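For reference, the state being verified from this point on (a LUKS-encrypted XFS filesystem on a partition of nvme1n1, mounted at /opt/test1, as confirmed by the /etc/fstab and /etc/crypttab contents just read) corresponds to the storage_pools value printed earlier in this run. A minimal playbook requesting that state through the role might look like the sketch below; this is a reconstruction from the logged variables, not the test's verbatim source, and the host pattern and privilege escalation are assumptions:

- hosts: all
  become: true
  tasks:
    - name: Create a LUKS-encrypted partition volume with the storage role
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - nvme1n1
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                # fs_type is not set here; the role falls back to xfs,
                # matching the expanded volume facts logged above.
                encryption: true
                encryption_password: yabbadabbadoo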
included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.056) 0:01:57.774 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.024) 0:01:57.799 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.023) 0:01:57.823 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.022) 0:01:57.845 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.022) 0:01:57.868 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.057) 0:01:57.925 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.022) 0:01:57.948 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.022) 0:01:57.970 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.021) 0:01:57.991 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check MD RAID] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.023) 0:01:58.014 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] 
********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.039) 0:01:58.054 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.021) 0:01:58.075 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.022) 0:01:58.097 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.021) 0:01:58.119 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.021) 0:01:58.140 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.021) 0:01:58.161 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.025) 0:01:58.187 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.024) 0:01:58.212 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.036) 0:01:58.248 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.045) 0:01:58.293 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 
'_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.027) 0:01:58.321 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:00:07 +0000 (0:00:00.043) 0:01:58.364 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 
'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.033) 0:01:58.397 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.045) 0:01:58.443 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.047) 0:01:58.491 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.023) 0:01:58.515 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.022) 0:01:58.538 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.035) 0:01:58.573 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: 
/tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.046) 0:01:58.620 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.029) 0:01:58.650 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.033) 0:01:58.683 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 
'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.042) 0:01:58.726 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.049) 0:01:58.775 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.084) 0:01:58.859 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.043) 0:01:58.903 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "0374c12d-dcfb-470b-9981-3d186641db7a" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { 
"block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "0374c12d-dcfb-470b-9981-3d186641db7a" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.056) 0:01:58.959 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.086) 0:01:59.045 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.085) 0:01:59.130 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.086) 0:01:59.217 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.024) 0:01:59.241 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.024) 0:01:59.265 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.026) 0:01:59.292 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.035) 0:01:59.327 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], 
"storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:00:08 +0000 (0:00:00.064) 0:01:59.392 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.051) 0:01:59.444 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.050) 0:01:59.494 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.036) 0:01:59.531 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.034) 0:01:59.565 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.040) 0:01:59.605 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.038) 0:01:59.644 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426402.2117245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426402.2117245, "dev": 5, "device_type": 66307, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 61619, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658426402.2117245, "nlink": 1, "path": "/dev/nvme1n1p1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.318) 0:01:59.962 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.041) 0:02:00.004 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.038) 0:02:00.042 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.036) 0:02:00.079 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.022) 0:02:00.102 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:00:09 +0000 (0:00:00.038) 0:02:00.141 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426402.3397245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426402.3397245, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 61622, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426402.3397245, "nlink": 1, "path": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:00:10 +0000 (0:00:00.318) 0:02:00.460 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:00:10 +0000 (0:00:00.544) 0:02:01.004 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1p1" ], "delta": "0:00:00.040804", "end": "2022-07-21 14:00:10.894180", "rc": 0, "start": "2022-07-21 14:00:10.853376" } STDOUT: LUKS header information for /dev/nvme1n1p1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 9a e9 6e 21 a3 58 55 70 e4 69 7b 3e 8c 8f 32 3a 03 24 82 ec MK salt: 1d 96 f3 57 53 2c 69 6d 45 ed ff 03 1c 7c 7f a4 42 2e 12 15 fa 57 37 d7 ea 1d 3d 49 67 64 f4 de MK iterations: 22755 
UUID: 01395b91-59c1-4cd8-bd79-f70ece0d6b5a Key Slot 0: ENABLED Iterations: 365102 Salt: c5 a6 b7 5d 5a 66 c4 03 f3 24 f3 74 2b bd 3a 79 3e 92 57 9d 3f e3 b2 7e bf 36 0d 52 2d a6 d5 ef Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:00:10 +0000 (0:00:00.354) 0:02:01.359 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.038) 0:02:01.397 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.051) 0:02:01.448 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.038) 0:02:01.487 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.039) 0:02:01.527 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.026) 0:02:01.553 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.024) 0:02:01.578 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.023) 0:02:01.601 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a /dev/nvme1n1p1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.052) 0:02:01.653 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.082) 0:02:01.736 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.054) 0:02:01.790 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.050) 0:02:01.841 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.084) 0:02:01.925 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.067) 0:02:01.993 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.035) 0:02:02.028 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.035) 0:02:02.065 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.037) 0:02:02.102 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.082) 0:02:02.184 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.039) 0:02:02.223 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.038) 0:02:02.262 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.035) 0:02:02.297 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.024) 0:02:02.322 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.034) 0:02:02.356 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:00:11 +0000 (0:00:00.036) 0:02:02.392 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.036) 0:02:02.428 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.043) 0:02:02.472 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.036) 0:02:02.509 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.035) 0:02:02.544 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.034) 0:02:02.579 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.038) 0:02:02.618 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.037) 0:02:02.655 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' 
is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.038) 0:02:02.694 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.025) 0:02:02.719 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.026) 0:02:02.746 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.025) 0:02:02.771 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.025) 0:02:02.796 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.025) 0:02:02.821 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.035) 0:02:02.857 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.029) 0:02:02.886 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.023) 0:02:02.910 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.039) 0:02:02.949 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.025) 0:02:02.974 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, 
"storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/create-test-file.yml:10 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.034) 0:02:03.009 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:221 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.319) 0:02:03.329 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:00:12 +0000 (0:00:00.038) 0:02:03.368 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:00:13 +0000 (0:00:00.034) 0:02:03.402 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:00:13 +0000 (0:00:00.435) 0:02:03.838 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:00:13 +0000 (0:00:00.063) 0:02:03.901 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:00:13 +0000 (0:00:00.074) 0:02:03.975 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate 
provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:00:13 +0000 (0:00:00.119) 0:02:04.094 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:00:13 +0000 (0:00:00.057) 0:02:04.152 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:00:13 +0000 (0:00:00.022) 0:02:04.174 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:00:14 +0000 (0:00:00.747) 0:02:04.921 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:00:14 +0000 (0:00:00.037) 0:02:04.959 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:00:14 +0000 (0:00:00.035) 0:02:04.995 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:00:15 +0000 (0:00:01.001) 0:02:05.996 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK 
[fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:00:15 +0000 (0:00:00.046) 0:02:06.043 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:00:15 +0000 (0:00:00.035) 0:02:06.078 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:00:15 +0000 (0:00:00.040) 0:02:06.118 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:00:15 +0000 (0:00:00.034) 0:02:06.152 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:00:16 +0000 (0:00:00.520) 0:02:06.673 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", 
"source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service": { "name": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": 
"systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:00:17 +0000 (0:00:01.015) 0:02:07.688 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:00:17 +0000 (0:00:00.060) 0:02:07.749 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2db33e637e\x2dac54\x2d4d1f\x2da39e\x2de9eff8d644f1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "name": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-readahead-collect.service dev-nvme1n1.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-nvme1n1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": 
"18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1 /dev/nvme1n1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b33e637e-ac54-4d1f-a39e-e9eff8d644f1 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": 
"30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:00:17 +0000 (0:00:00.476) 0:02:08.226 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 18:00:18 +0000 (0:00:01.032) 0:02:09.258 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': 
None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a' in safe mode due to encryption removal", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:00:18 +0000 (0:00:00.040) 0:02:09.299 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2db33e637e\x2dac54\x2d4d1f\x2da39e\x2de9eff8d644f1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "name": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", 
"MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2db33e637e\\x2dac54\\x2d4d1f\\x2da39e\\x2de9eff8d644f1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:240 Thursday 21 July 2022 18:00:19 +0000 (0:00:00.490) 0:02:09.789 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:246 Thursday 21 July 2022 18:00:19 +0000 (0:00:00.081) 0:02:09.871 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:10 Thursday 21 July 2022 18:00:19 +0000 (0:00:00.047) 0:02:09.919 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426412.8737245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426412.8737245, "dev": 64512, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658426412.8737245, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "135120590", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:15 Thursday 21 July 2022 18:00:19 +0000 (0:00:00.321) 0:02:10.241 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] 
********************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:253 Thursday 21 July 2022 18:00:19 +0000 (0:00:00.042) 0:02:10.283 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:00:19 +0000 (0:00:00.038) 0:02:10.322 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:00:19 +0000 (0:00:00.047) 0:02:10.370 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:00:20 +0000 (0:00:00.409) 0:02:10.779 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:00:20 +0000 (0:00:00.063) 0:02:10.843 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:00:20 +0000 (0:00:00.039) 0:02:10.883 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:00:20 +0000 (0:00:00.035) 0:02:10.919 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm 
packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:00:20 +0000 (0:00:00.058) 0:02:10.977 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:00:20 +0000 (0:00:00.021) 0:02:10.999 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:00:21 +0000 (0:00:00.737) 0:02:11.737 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:00:21 +0000 (0:00:00.040) 0:02:11.777 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:00:21 +0000 (0:00:00.036) 0:02:11.814 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:00:22 +0000 (0:00:01.033) 0:02:12.848 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:00:22 +0000 (0:00:00.047) 0:02:12.895 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:00:22 +0000 (0:00:00.033) 
0:02:12.929 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:00:22 +0000 (0:00:00.038) 0:02:12.967 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:00:22 +0000 (0:00:00.034) 0:02:13.002 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:00:23 +0000 (0:00:00.590) 0:02:13.592 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", 
"source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": 
"rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service": { "name": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { 
"name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:00:24 +0000 (0:00:01.002) 0:02:14.595 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:00:24 +0000 (0:00:00.057) 0:02:14.653 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d01395b91\x2d59c1\x2d4cd8\x2dbd79\x2df70ece0d6b5a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "name": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service cryptsetup-pre.target systemd-readahead-replay.service dev-nvme1n1p1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-nvme1n1p1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a /dev/nvme1n1p1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", 
"WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:00:24 +0000 (0:00:00.481) 0:02:15.135 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1p1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:00:26 +0000 (0:00:01.496) 0:02:16.631 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:00:26 +0000 (0:00:00.039) 0:02:16.671 ********* changed: [/cache/rhel-7.qcow2] => 
(item=systemd-cryptsetup@luks\x2d01395b91\x2d59c1\x2d4cd8\x2dbd79\x2df70ece0d6b5a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "name": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", 
"StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:00:26 +0000 (0:00:00.462) 0:02:17.134 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1p1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:00:26 +0000 (0:00:00.048) 0:02:17.182 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:00:26 +0000 (0:00:00.037) 0:02:17.220 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:00:26 +0000 (0:00:00.045) 0:02:17.265 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:00:27 +0000 (0:00:00.344) 0:02:17.610 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 
Thursday 21 July 2022 18:00:27 +0000 (0:00:00.480) 0:02:18.091 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:00:28 +0000 (0:00:00.395) 0:02:18.486 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:00:28 +0000 (0:00:00.446) 0:02:18.932 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426407.1597245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d688fb2d2b23fabe2b802af909460928b3dab341", "ctime": 1658426405.2597246, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 16785288, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658426405.2587245, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 59, "uid": 0, "version": "18446744073552731307", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:00:28 +0000 (0:00:00.372) 0:02:19.305 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:00:29 +0000 (0:00:00.395) 0:02:19.700 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:270 Thursday 21 July 2022 18:00:30 +0000 (0:00:00.933) 0:02:20.634 ********* included: 
/tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:00:30 +0000 (0:00:00.044) 0:02:20.679 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:00:30 +0000 (0:00:00.055) 0:02:20.734 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:00:30 +0000 (0:00:00.038) 0:02:20.773 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1p1": { "fstype": "xfs", "label": "", "name": "/dev/nvme1n1p1", "size": "10G", "type": "partition", "uuid": "ee17d1e9-4e99-471d-9d3e-cde5291c2929" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:00:30 +0000 (0:00:00.314) 0:02:21.088 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003428", "end": "2022-07-21 14:00:30.937959", "rc": 0, "start": "2022-07-21 14:00:30.934531" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.308) 0:02:21.397 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003308", "end": "2022-07-21 14:00:31.243029", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:00:31.239721" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.305) 0:02:21.702 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 
'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.057) 0:02:21.760 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.036) 0:02:21.796 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.044) 0:02:21.841 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.025) 0:02:21.866 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.023) 0:02:21.889 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.025) 0:02:21.915 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.027) 0:02:21.943 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.025) 0:02:21.968 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.026) 0:02:21.995 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.024) 0:02:22.020 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.024) 0:02:22.044 ********* TASK [Check MD RAID] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.027) 0:02:22.072 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.056) 0:02:22.128 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.025) 0:02:22.154 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.023) 0:02:22.177 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.023) 0:02:22.201 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.059) 0:02:22.261 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.024) 0:02:22.285 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
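For orientation while reading these verification tasks: the pool and volume parameters echoed in the loop items above (pool "foo" of type partition on nvme1n1, volume "test1", xfs on /opt/test1, encryption now false) correspond to a role invocation roughly like the sketch below. This is an illustrative reconstruction, not the test playbook itself; it assumes the role's public storage_pools interface, with the concrete values copied from the pool information printed earlier in this log, and the dummy password is the test's own throwaway secret.

```yaml
# Hedged sketch only -- values taken from the pool/volume data shown above.
- hosts: all
  tasks:
    - name: Remove the LUKS layer and reformat the partition as xfs
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - nvme1n1
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: false                   # switching this off drives the
                encryption_password: yabbadabbadoo  # destroy/create format actions logged above
```

With input like this, the log above shows the role masking the generator-created systemd-cryptsetup unit, letting blivet destroy the LUKS format and recreate xfs on /dev/nvme1n1p1, unmasking the unit again, updating /etc/fstab and /etc/crypttab, and then handing off to these verify-role-results tasks; the RAID, LVM RAID, thin-pool and VDO checks all skip because the pool is a plain partition pool with those features unset.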
TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.022) 0:02:22.308 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.025) 0:02:22.333 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:00:31 +0000 (0:00:00.035) 0:02:22.369 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.042) 0:02:22.412 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", 
"vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.032) 0:02:22.445 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.045) 0:02:22.490 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.031) 0:02:22.521 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.047) 0:02:22.569 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, 
"changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.049) 0:02:22.618 ********* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.021) 0:02:22.640 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.024) 0:02:22.665 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.035) 0:02:22.700 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.046) 0:02:22.747 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/nvme1n1p1", "_kernel_device": "/dev/nvme1n1p1", "_mount_id": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", 
"state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.028) 0:02:22.776 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.037) 0:02:22.813 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/nvme1n1p1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/nvme1n1p1', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': 'UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.043) 0:02:22.857 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.050) 0:02:22.908 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] 
************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.088) 0:02:22.996 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/nvme1n1p1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.047) 0:02:23.043 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2610376, "block_size": 4096, "block_total": 2618624, "block_used": 8248, "device": "/dev/nvme1n1p1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10692100096, "size_total": 10725883904, "uuid": "ee17d1e9-4e99-471d-9d3e-cde5291c2929" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2610376, "block_size": 4096, "block_total": 2618624, "block_used": 8248, "device": "/dev/nvme1n1p1", "fstype": "xfs", "inode_available": 5242365, "inode_total": 5242368, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10692100096, "size_total": 10725883904, "uuid": "ee17d1e9-4e99-471d-9d3e-cde5291c2929" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.059) 0:02:23.103 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.052) 0:02:23.156 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.049) 0:02:23.205 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.051) 0:02:23.256 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.022) 0:02:23.279 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.023) 0:02:23.303 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.028) 0:02:23.332 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:00:32 +0000 (0:00:00.038) 0:02:23.370 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.062) 0:02:23.432 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.094) 0:02:23.526 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.133) 0:02:23.660 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.038) 0:02:23.698 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.035) 0:02:23.734 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.042) 0:02:23.776 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.042) 0:02:23.819 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": 
{ "atime": 1658426426.1367245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426426.1367245, "dev": 5, "device_type": 66307, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 70955, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658426426.1367245, "nlink": 1, "path": "/dev/nvme1n1p1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.332) 0:02:24.151 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.038) 0:02:24.190 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.038) 0:02:24.228 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.040) 0:02:24.268 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.023) 0:02:24.291 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.040) 0:02:24.332 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:00:33 +0000 (0:00:00.023) 0:02:24.355 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.524) 0:02:24.880 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 
Thursday 21 July 2022 18:00:34 +0000 (0:00:00.023) 0:02:24.903 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.021) 0:02:24.925 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.047) 0:02:24.973 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.021) 0:02:24.994 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.022) 0:02:25.016 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.021) 0:02:25.038 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.021) 0:02:25.059 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.021) 0:02:25.080 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.048) 0:02:25.129 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.044) 0:02:25.174 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.033) 0:02:25.207 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.032) 0:02:25.239 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.039) 0:02:25.279 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.036) 0:02:25.316 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.038) 0:02:25.354 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:00:34 +0000 (0:00:00.037) 0:02:25.391 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.040) 0:02:25.431 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.035) 0:02:25.467 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.038) 0:02:25.505 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.037) 0:02:25.542 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.040) 0:02:25.583 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.025) 0:02:25.609 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.036) 0:02:25.645 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.036) 0:02:25.681 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.039) 0:02:25.721 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.039) 0:02:25.761 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.036) 0:02:25.797 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.035) 0:02:25.833 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.039) 0:02:25.872 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.035) 0:02:25.908 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.071) 0:02:25.979 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.104) 0:02:26.084 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.024) 0:02:26.109 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.023) 0:02:26.132 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.023) 0:02:26.155 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.023) 0:02:26.179 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.025) 0:02:26.205 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.025) 0:02:26.230 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.025) 0:02:26.256 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.027) 0:02:26.283 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.036) 0:02:26.319 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.023) 0:02:26.343 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/create-test-file.yml:10 Thursday 21 July 2022 18:00:35 +0000 (0:00:00.036) 0:02:26.379 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": 
"unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:276 Thursday 21 July 2022 18:00:36 +0000 (0:00:00.354) 0:02:26.734 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:00:36 +0000 (0:00:00.045) 0:02:26.779 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:00:36 +0000 (0:00:00.037) 0:02:26.816 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:00:36 +0000 (0:00:00.408) 0:02:27.224 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:00:36 +0000 (0:00:00.066) 0:02:27.291 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:00:36 +0000 (0:00:00.036) 0:02:27.328 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:00:36 +0000 (0:00:00.037) 0:02:27.365 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:00:37 +0000 (0:00:00.057) 0:02:27.423 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:00:37 +0000 (0:00:00.022) 0:02:27.446 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:00:37 +0000 (0:00:00.719) 0:02:28.165 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:00:37 +0000 (0:00:00.040) 0:02:28.206 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:00:37 +0000 (0:00:00.039) 0:02:28.245 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:00:38 +0000 (0:00:00.992) 0:02:29.238 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:00:38 +0000 (0:00:00.094) 0:02:29.333 ********* TASK [fedora.linux_system_roles.storage : make sure COPR 
support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:00:38 +0000 (0:00:00.037) 0:02:29.370 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:00:39 +0000 (0:00:00.037) 0:02:29.408 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:00:39 +0000 (0:00:00.035) 0:02:29.443 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:00:39 +0000 (0:00:00.531) 0:02:29.975 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" 
}, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { 
"name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service": { "name": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": 
"systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:00:40 +0000 (0:00:01.018) 0:02:30.993 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:00:40 +0000 (0:00:00.061) 0:02:31.055 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d01395b91\x2d59c1\x2d4cd8\x2dbd79\x2df70ece0d6b5a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "name": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service dev-nvme1n1p1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-nvme1n1p1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a", "DevicePolicy": 
"auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a /dev/nvme1n1p1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-01395b91-59c1-4cd8-bd79-f70ece0d6b5a ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": 
"oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:00:41 +0000 (0:00:00.476) 0:02:31.532 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'nvme1n1p1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 18:00:42 +0000 (0:00:00.975) 0:02:32.508 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 
'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'nvme1n1p1' in safe mode due to adding encryption", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:00:42 +0000 (0:00:00.041) 0:02:32.549 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d01395b91\x2d59c1\x2d4cd8\x2dbd79\x2df70ece0d6b5a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "name": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d01395b91\\x2d59c1\\x2d4cd8\\x2dbd79\\x2df70ece0d6b5a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": 
"no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:297 Thursday 21 July 2022 18:00:42 +0000 (0:00:00.470) 0:02:33.020 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:303 Thursday 21 July 2022 18:00:42 +0000 (0:00:00.040) 0:02:33.060 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:10 Thursday 21 July 2022 18:00:42 +0000 (0:00:00.055) 0:02:33.116 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426436.2577245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426436.2577245, "dev": 66307, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658426436.2577245, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1404095774", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:15 Thursday 21 July 2022 18:00:43 +0000 (0:00:00.327) 0:02:33.444 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:312 Thursday 21 July 2022 18:00:43 +0000 (0:00:00.043) 0:02:33.487 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testkPLDKVlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 
0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:319 Thursday 21 July 2022 18:00:43 +0000 (0:00:00.479) 0:02:33.967 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testkPLDKVlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1658426443.6250336-161957-110544134833627/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:326 Thursday 21 July 2022 18:00:44 +0000 (0:00:00.738) 0:02:34.705 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:00:44 +0000 (0:00:00.040) 0:02:34.746 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:00:44 +0000 (0:00:00.038) 0:02:34.785 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:00:44 +0000 (0:00:00.472) 0:02:35.257 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:00:44 +0000 (0:00:00.110) 0:02:35.367 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:00:45 +0000 (0:00:00.037) 
0:02:35.405 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:00:45 +0000 (0:00:00.039) 0:02:35.445 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:00:45 +0000 (0:00:00.062) 0:02:35.507 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:00:45 +0000 (0:00:00.023) 0:02:35.531 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:00:45 +0000 (0:00:00.662) 0:02:36.193 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:00:45 +0000 (0:00:00.040) 0:02:36.233 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:00:45 +0000 (0:00:00.037) 0:02:36.271 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 
Thursday 21 July 2022 18:00:46 +0000 (0:00:01.005) 0:02:37.276 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:00:46 +0000 (0:00:00.048) 0:02:37.325 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:00:46 +0000 (0:00:00.036) 0:02:37.362 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:00:47 +0000 (0:00:00.039) 0:02:37.401 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:00:47 +0000 (0:00:00.039) 0:02:37.440 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:00:47 +0000 (0:00:00.573) 0:02:38.014 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": 
"cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { 
"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": 
"systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": 
"systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:00:48 +0000 (0:00:01.050) 0:02:39.064 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:00:48 +0000 (0:00:00.058) 0:02:39.123 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:00:48 +0000 (0:00:00.021) 0:02:39.144 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "password": "/tmp/storage_testkPLDKVlukskey", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", 
"passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:00:56 +0000 (0:00:07.630) 0:02:46.775 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:00:56 +0000 (0:00:00.045) 0:02:46.820 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:00:56 +0000 (0:00:00.026) 0:02:46.846 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "password": "/tmp/storage_testkPLDKVlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:00:56 +0000 (0:00:00.091) 0:02:46.938 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": 
false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:00:56 +0000 (0:00:00.140) 0:02:47.078 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:00:56 +0000 (0:00:00.039) 0:02:47.118 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 'UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ee17d1e9-4e99-471d-9d3e-cde5291c2929" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:00:57 +0000 (0:00:00.352) 0:02:47.470 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:00:57 +0000 (0:00:00.473) 0:02:47.943 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:00:57 +0000 (0:00:00.368) 0:02:48.311 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:00:58 +0000 (0:00:00.456) 0:02:48.768 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426431.2417245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", 
"checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426429.2317245, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658426429.2307246, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "542786568", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:00:58 +0000 (0:00:00.356) 0:02:49.124 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '/tmp/storage_testkPLDKVlukskey', 'name': 'luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "password": "/tmp/storage_testkPLDKVlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:00:59 +0000 (0:00:00.340) 0:02:49.464 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:343 Thursday 21 July 2022 18:00:59 +0000 (0:00:00.849) 0:02:50.314 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:00:59 +0000 (0:00:00.038) 0:02:50.352 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:01:00 +0000 (0:00:00.054) 0:02:50.407 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:01:00 +0000 (0:00:00.044) 0:02:50.452 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "size": "10G", "type": "crypt", "uuid": "a5e0139e-6e18-476f-8416-7263138079c6" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1p1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/nvme1n1p1", "size": "10G", "type": "partition", "uuid": "6192fdd9-f7eb-4e8a-aed8-d86682e15c34" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:01:01 +0000 (0:00:01.320) 0:02:51.772 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003376", "end": "2022-07-21 14:01:01.624773", "rc": 0, "start": "2022-07-21 14:01:01.621397" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] 
********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:01:01 +0000 (0:00:00.314) 0:02:52.086 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003368", "end": "2022-07-21 14:01:01.939559", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:01:01.936191" } STDOUT: luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34 /dev/nvme1n1p1 /tmp/storage_testkPLDKVlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.312) 0:02:52.399 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkPLDKVlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'partition', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.060) 0:02:52.459 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.078) 0:02:52.538 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.088) 0:02:52.626 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.024) 0:02:52.651 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.020) 0:02:52.672 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.023) 0:02:52.696 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.022) 0:02:52.718 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.021) 0:02:52.740 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.024) 0:02:52.765 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.026) 0:02:52.791 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.024) 0:02:52.816 ********* TASK [Check MD RAID] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.021) 0:02:52.838 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.042) 0:02:52.881 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.026) 0:02:52.907 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.025) 0:02:52.932 ********* skipping: [/cache/rhel-7.qcow2] => 
{ "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.024) 0:02:52.957 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.024) 0:02:52.981 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.022) 0:02:53.004 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.023) 0:02:53.027 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.024) 0:02:53.052 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.043) 0:02:53.095 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.046) 0:02:53.142 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkPLDKVlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, 
"skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.031) 0:02:53.174 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.047) 0:02:53.221 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkPLDKVlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.030) 0:02:53.252 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.046) 0:02:53.299 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.048) 0:02:53.347 ********* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.020) 0:02:53.368 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:01:02 +0000 (0:00:00.020) 0:02:53.389 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.036) 0:02:53.425 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.047) 0:02:53.472 ********* skipping: [/cache/rhel-7.qcow2] => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkPLDKVlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 
'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "_raw_device": "/dev/nvme1n1p1", "_raw_kernel_device": "/dev/nvme1n1p1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testkPLDKVlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.028) 0:02:53.501 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.036) 0:02:53.537 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1p1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'partition', '_device': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-0', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testkPLDKVlukskey', 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/nvme1n1p1', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.043) 0:02:53.581 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.054) 0:02:53.635 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.142) 0:02:53.778 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.043) 0:02:53.821 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "a5e0139e-6e18-476f-8416-7263138079c6" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 2609864, "block_size": 4096, "block_total": 2618112, "block_used": 8248, "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fstype": "xfs", "inode_available": 5241341, "inode_total": 5241344, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 10690002944, "size_total": 10723786752, "uuid": "a5e0139e-6e18-476f-8416-7263138079c6" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.055) 0:02:53.877 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task 
path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.052) 0:02:53.929 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.051) 0:02:53.981 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.054) 0:02:54.036 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.025) 0:02:54.061 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.026) 0:02:54.087 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.025) 0:02:54.112 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.035) 0:02:54.148 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.062) 0:02:54.210 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.054) 0:02:54.265 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:01:03 +0000 
(0:00:00.049) 0:02:54.315 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.039) 0:02:54.354 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:01:03 +0000 (0:00:00.036) 0:02:54.391 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.040) 0:02:54.432 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.041) 0:02:54.473 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426456.1397245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426456.1397245, "dev": 5, "device_type": 66307, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 80129, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658426456.1397245, "nlink": 1, "path": "/dev/nvme1n1p1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.328) 0:02:54.802 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.038) 0:02:54.840 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.039) 0:02:54.879 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.036) 0:02:54.916 ********* 
skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.026) 0:02:54.942 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.045) 0:02:54.987 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426456.2657247, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426456.2657247, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 80162, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426456.2657247, "nlink": 1, "path": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:01:04 +0000 (0:00:00.320) 0:02:55.308 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:01:05 +0000 (0:00:00.545) 0:02:55.853 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/nvme1n1p1" ], "delta": "0:00:00.036640", "end": "2022-07-21 14:01:05.748204", "rc": 0, "start": "2022-07-21 14:01:05.711564" } STDOUT: LUKS header information for /dev/nvme1n1p1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 64 d7 d4 8c e1 28 b3 49 35 e9 2a ce 36 d9 48 ec a4 43 b1 46 MK salt: 90 d0 0f 03 4d 2c c6 54 c3 a9 b5 23 04 93 c8 98 1b 69 b9 33 0e c3 17 50 29 95 e9 55 ab b4 a0 1f MK iterations: 22882 UUID: 6192fdd9-f7eb-4e8a-aed8-d86682e15c34 Key Slot 0: ENABLED Iterations: 365612 Salt: e6 99 39 83 81 04 a9 f4 42 f3 bb 34 66 d7 a3 6e d7 ae 31 e7 d9 fe 99 69 eb 5a 9d 7e 7c 42 80 a4 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:01:05 +0000 (0:00:00.359) 0:02:56.212 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:01:05 +0000 
(0:00:00.097) 0:02:56.309 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:01:05 +0000 (0:00:00.060) 0:02:56.370 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.041) 0:02:56.412 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.089) 0:02:56.501 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.027) 0:02:56.529 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.026) 0:02:56.555 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.027) 0:02:56.583 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34 /dev/nvme1n1p1 /tmp/storage_testkPLDKVlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testkPLDKVlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.094) 0:02:56.677 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.048) 0:02:56.726 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.101) 0:02:56.827 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.091) 0:02:56.918 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK 
[set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.091) 0:02:57.010 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.083) 0:02:57.094 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.082) 0:02:57.176 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.038) 0:02:57.215 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.037) 0:02:57.253 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.039) 0:02:57.293 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.037) 0:02:57.330 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:01:06 +0000 (0:00:00.038) 0:02:57.369 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.039) 0:02:57.408 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.027) 0:02:57.436 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 
21 July 2022 18:01:07 +0000 (0:00:00.037) 0:02:57.473 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.038) 0:02:57.512 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.039) 0:02:57.551 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.039) 0:02:57.591 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.038) 0:02:57.629 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.038) 0:02:57.668 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.038) 0:02:57.706 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.040) 0:02:57.747 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.037) 0:02:57.784 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.037) 0:02:57.821 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.025) 0:02:57.847 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 
18:01:07 +0000 (0:00:00.026) 0:02:57.873 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.024) 0:02:57.898 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.025) 0:02:57.924 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.024) 0:02:57.948 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.026) 0:02:57.974 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.026) 0:02:58.000 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.026) 0:02:58.027 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.035) 0:02:58.062 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.024) 0:02:58.087 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:345 Thursday 21 July 2022 18:01:07 +0000 (0:00:00.037) 0:02:58.124 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "path": "/tmp/storage_testkPLDKVlukskey", "state": "absent" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:357 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.319) 0:02:58.443 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.040) 0:02:58.483 ********* 
included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.035) 0:02:58.519 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.479) 0:02:58.999 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.064) 0:02:59.063 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.046) 0:02:59.110 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.041) 0:02:59.151 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.061) 0:02:59.213 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:01:08 +0000 (0:00:00.024) 0:02:59.237 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:01:09 +0000 (0:00:00.745) 0:02:59.982 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:01:09 +0000 (0:00:00.041) 0:03:00.024 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:01:09 +0000 (0:00:00.040) 0:03:00.065 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:01:10 +0000 (0:00:01.019) 0:03:01.084 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:01:10 +0000 (0:00:00.047) 0:03:01.132 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:01:10 +0000 (0:00:00.037) 0:03:01.169 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:01:10 +0000 (0:00:00.041) 0:03:01.210 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are 
installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:01:10 +0000 (0:00:00.034) 0:03:01.245 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:01:11 +0000 (0:00:00.529) 0:03:01.774 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { 
"name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": 
"ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:01:12 +0000 (0:00:01.013) 0:03:02.787 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:01:12 +0000 (0:00:00.059) 0:03:02.846 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:01:12 +0000 (0:00:00.022) 0:03:02.869 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 18:01:13 +0000 (0:00:01.105) 0:03:03.975 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': False, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "encrypted volume 'test1' missing key/password", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:01:13 +0000 (0:00:00.042) 0:03:04.018 ********* TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:377 Thursday 21 July 2022 18:01:13 +0000 (0:00:00.022) 0:03:04.041 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the keyless luks test] ****************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:383 Thursday 21 July 2022 
18:01:13 +0000 (0:00:00.037) 0:03:04.078 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:388 Thursday 21 July 2022 18:01:13 +0000 (0:00:00.054) 0:03:04.132 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:01:13 +0000 (0:00:00.041) 0:03:04.174 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:01:13 +0000 (0:00:00.038) 0:03:04.212 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:01:14 +0000 (0:00:00.412) 0:03:04.624 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:01:14 +0000 (0:00:00.112) 0:03:04.737 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:01:14 +0000 (0:00:00.105) 0:03:04.842 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:01:14 +0000 (0:00:00.082) 0:03:04.925 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:01:14 +0000 (0:00:00.099) 0:03:05.025 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:01:14 +0000 (0:00:00.021) 0:03:05.046 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:01:15 +0000 (0:00:00.717) 0:03:05.764 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:01:15 +0000 (0:00:00.041) 0:03:05.805 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:01:15 +0000 (0:00:00.039) 0:03:05.845 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:01:16 +0000 (0:00:01.065) 0:03:06.911 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:01:16 +0000 
(0:00:00.047) 0:03:06.958 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:01:16 +0000 (0:00:00.040) 0:03:06.999 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:01:16 +0000 (0:00:00.039) 0:03:07.038 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:01:16 +0000 (0:00:00.033) 0:03:07.071 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:01:17 +0000 (0:00:00.532) 0:03:07.604 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": 
"microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:01:18 +0000 (0:00:00.993) 0:03:08.598 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:01:18 +0000 (0:00:00.063) 0:03:08.662 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:01:18 +0000 (0:00:00.024) 0:03:08.686 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/nvme1n1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cb4d86dd-641f-4b08-abde-c73b587de633", "password": "-", "state": "present" } ], "leaves": [ 
"/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:01:26 +0000 (0:00:08.133) 0:03:16.819 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:01:26 +0000 (0:00:00.038) 0:03:16.858 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:01:26 +0000 (0:00:00.025) 0:03:16.884 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1p1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/nvme1n1p1", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "disklabel" }, { "action": "create format", "device": 
"/dev/nvme1n1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/nvme1n1p1", "name": "luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cb4d86dd-641f-4b08-abde-c73b587de633", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:01:26 +0000 (0:00:00.042) 0:03:16.926 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:01:26 +0000 (0:00:00.042) 0:03:16.969 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:01:26 +0000 (0:00:00.042) 0:03:17.011 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:01:26 +0000 (0:00:00.364) 0:03:17.376 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:01:27 +0000 (0:00:00.505) 0:03:17.882 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { 
"dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:01:27 +0000 (0:00:00.366) 0:03:18.248 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:01:28 +0000 (0:00:00.506) 0:03:18.755 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426461.9387245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a6e92e31a838e5d2f2a4f9528467cb3c703e94c7", "ctime": 1658426458.9987245, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 20976594, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658426458.9977245, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "843013576", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:01:28 +0000 (0:00:00.325) 0:03:19.081 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34', 'backing_device': '/dev/nvme1n1p1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/nvme1n1p1", "name": "luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cb4d86dd-641f-4b08-abde-c73b587de633", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:01:29 +0000 (0:00:00.667) 0:03:19.748 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:407 Thursday 21 July 2022 18:01:30 +0000 (0:00:00.853) 0:03:20.601 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print 
out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:01:30 +0000 (0:00:00.040) 0:03:20.642 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "serpent-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:01:30 +0000 (0:00:00.050) 0:03:20.693 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:01:30 +0000 (0:00:00.035) 0:03:20.728 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "cb4d86dd-641f-4b08-abde-c73b587de633" }, "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "size": "4G", "type": "crypt", "uuid": "1efb59fc-232b-4834-8926-4d9f6ae9f611" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:01:30 +0000 (0:00:00.319) 0:03:21.047 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003517", "end": "2022-07-21 14:01:30.892825", "rc": 0, "start": "2022-07-21 14:01:30.889308" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:01:30 +0000 (0:00:00.307) 0:03:21.354 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003312", "end": "2022-07-21 14:01:31.201051", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:01:31.197739" } STDOUT: luks-cb4d86dd-641f-4b08-abde-c73b587de633 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were 
correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:01:31 +0000 (0:00:00.307) 0:03:21.661 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:01:31 +0000 (0:00:00.056) 0:03:21.718 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:01:31 +0000 (0:00:00.034) 0:03:21.753 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:01:31 +0000 (0:00:00.047) 0:03:21.801 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:01:31 +0000 (0:00:00.056) 0:03:21.857 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:01:31 +0000 (0:00:00.465) 0:03:22.323 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { 
"__pvs_lvm_len": "1" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:01:31 +0000 (0:00:00.052) 0:03:22.375 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.048) 0:03:22.424 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.095) 0:03:22.519 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.039) 0:03:22.559 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.088) 0:03:22.648 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.023) 0:03:22.672 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.088) 0:03:22.760 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.047) 0:03:22.808 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.025) 0:03:22.833 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.068) 0:03:22.901 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.023) 
0:03:22.925 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.023) 0:03:22.948 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.024) 0:03:22.973 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.027) 0:03:23.000 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.026) 0:03:23.027 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.040) 0:03:23.068 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.046) 0:03:23.114 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.045) 0:03:23.160 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.031) 0:03:23.191 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.030) 0:03:23.221 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.029) 0:03:23.251 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.048) 0:03:23.299 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.047) 0:03:23.347 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:01:32 +0000 (0:00:00.025) 0:03:23.373 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.026) 0:03:23.399 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.023) 0:03:23.423 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.025) 0:03:23.448 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.048) 0:03:23.496 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.048) 0:03:23.545 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.030) 0:03:23.576 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.047) 0:03:23.624 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.052) 0:03:23.676 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.051) 0:03:23.728 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.041) 0:03:23.769 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.039) 0:03:23.809 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.036) 0:03:23.845 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.036) 0:03:23.882 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.037) 0:03:23.919 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.049) 0:03:23.969 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.048) 0:03:24.017 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.026) 0:03:24.043 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.024) 0:03:24.067 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:01:33 +0000 
(0:00:00.025) 0:03:24.093 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.060) 0:03:24.154 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.024) 0:03:24.178 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.026) 0:03:24.205 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.026) 0:03:24.231 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.036) 0:03:24.268 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.034) 0:03:24.303 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 512, 'encryption_cipher': 'serpent-xts-plain64', 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': [], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:01:33 +0000 (0:00:00.045) 0:03:24.348 
********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.052) 0:03:24.400 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.086) 0:03:24.487 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.041) 0:03:24.528 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "1efb59fc-232b-4834-8926-4d9f6ae9f611" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "1efb59fc-232b-4834-8926-4d9f6ae9f611" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.060) 0:03:24.589 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.054) 
0:03:24.643 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.049) 0:03:24.693 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.055) 0:03:24.748 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.026) 0:03:24.774 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.023) 0:03:24.797 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.022) 0:03:24.820 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.038) 0:03:24.858 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.063) 0:03:24.922 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.051) 0:03:24.974 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.054) 0:03:25.028 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.042) 0:03:25.070 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.037) 0:03:25.108 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.042) 0:03:25.151 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:01:34 +0000 (0:00:00.043) 0:03:25.194 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426486.1447246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426486.1447246, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 86618, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426486.1447246, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:01:35 +0000 (0:00:00.328) 0:03:25.523 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:01:35 +0000 (0:00:00.041) 0:03:25.565 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:01:35 +0000 (0:00:00.040) 0:03:25.605 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:01:35 +0000 (0:00:00.043) 0:03:25.649 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's 
device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:01:35 +0000 (0:00:00.030) 0:03:25.680 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:01:35 +0000 (0:00:00.041) 0:03:25.721 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426486.3037245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426486.3037245, "dev": 5, "device_type": 64513, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 87512, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426486.3037245, "nlink": 1, "path": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:01:35 +0000 (0:00:00.314) 0:03:26.036 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:01:36 +0000 (0:00:00.558) 0:03:26.594 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.034682", "end": "2022-07-21 14:01:36.477828", "rc": 0, "start": "2022-07-21 14:01:36.443146" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 0b 23 59 4e a1 99 09 9c 11 4d 88 c4 af d5 85 ff dc 01 bb 4c MK salt: 4b 34 73 2f ef 56 a3 02 b0 52 b8 77 e4 84 05 48 d9 51 ae 90 9a 59 00 41 c8 ae 6b e0 91 d6 f9 b3 MK iterations: 23043 UUID: cb4d86dd-641f-4b08-abde-c73b587de633 Key Slot 0: ENABLED Iterations: 365612 Salt: 65 66 ab 84 41 c3 38 f0 37 09 f9 b3 75 92 18 61 0f a3 4c 5a 34 e0 d0 82 62 a6 74 14 e0 e6 23 8f Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:01:36 +0000 (0:00:00.345) 0:03:26.940 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:01:36 +0000 (0:00:00.039) 0:03:26.979 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed 
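The luksDump output captured above ("Collect LUKS info for this volume") is what the remaining encryption checks rely on. As a rough, hypothetical sketch of that verification step (not the role's actual test file), a standalone playbook could re-run the dump and assert the values requested in this run, i.e. LUKS1, cipher serpent-xts-plain64 and a 512-bit master key on /dev/mapper/foo-test1; the playbook layout and task names below are illustrative only.

- hosts: all
  tasks:
    # Hypothetical sketch: re-read the header that the test captured above.
    - name: Dump the LUKS header of the encrypted LV
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    # Expected values are the ones requested in this run (luks1,
    # serpent-xts-plain64, 512-bit key); adjust them to the volume under test.
    - name: Assert the header matches the requested encryption settings
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')
          - luks_dump.stdout is search('Cipher name:\s+serpent')
          - luks_dump.stdout is search('Cipher mode:\s+xts-plain64')
          - luks_dump.stdout is search('MK bits:\s+512')

Running such a playbook against the same host would fail the assert task if the header ever drifted from the requested settings, which is essentially what the "Check LUKS version", "Check LUKS key size" and "Check LUKS cipher" tasks below confirm.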
TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:01:36 +0000 (0:00:00.098) 0:03:27.078 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:01:36 +0000 (0:00:00.087) 0:03:27.165 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:01:36 +0000 (0:00:00.039) 0:03:27.205 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:01:36 +0000 (0:00:00.110) 0:03:27.315 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.174) 0:03:27.490 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.055) 0:03:27.546 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-cb4d86dd-641f-4b08-abde-c73b587de633 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.054) 0:03:27.600 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.046) 0:03:27.647 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.050) 0:03:27.698 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.052) 0:03:27.750 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.052) 0:03:27.803 ********* ok: 
[/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.035) 0:03:27.839 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.038) 0:03:27.877 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.038) 0:03:27.916 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.037) 0:03:27.953 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.037) 0:03:27.991 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.039) 0:03:28.030 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.040) 0:03:28.071 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:01:37 +0000 (0:00:00.039) 0:03:28.110 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.456) 0:03:28.566 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.331) 0:03:28.898 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] 
******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.053) 0:03:28.951 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.038) 0:03:28.990 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.038) 0:03:29.028 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.038) 0:03:29.067 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.037) 0:03:29.104 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.038) 0:03:29.142 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.039) 0:03:29.181 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.037) 0:03:29.219 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.035) 0:03:29.254 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:01:38 +0000 (0:00:00.050) 0:03:29.304 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.022014", "end": "2022-07-21 14:01:39.177280", "rc": 0, "start": "2022-07-21 14:01:39.155266" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.334) 0:03:29.638 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.053) 0:03:29.692 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.095) 0:03:29.788 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.041) 0:03:29.829 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.035) 0:03:29.864 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.034) 0:03:29.899 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.034) 0:03:29.934 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.037) 0:03:29.971 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.022) 0:03:29.994 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:409 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.074) 0:03:30.069 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.080) 0:03:30.150 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:01:39 +0000 (0:00:00.036) 0:03:30.186 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:01:40 +0000 (0:00:00.400) 0:03:30.587 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:01:40 +0000 (0:00:00.065) 0:03:30.652 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:01:40 +0000 (0:00:00.040) 0:03:30.693 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:01:40 +0000 (0:00:00.037) 0:03:30.730 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:01:40 +0000 (0:00:00.058) 0:03:30.788 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:01:40 +0000 (0:00:00.020) 0:03:30.809 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ 
"python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:01:41 +0000 (0:00:00.720) 0:03:31.530 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:01:41 +0000 (0:00:00.042) 0:03:31.573 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:01:41 +0000 (0:00:00.040) 0:03:31.613 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:01:42 +0000 (0:00:01.188) 0:03:32.801 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:01:42 +0000 (0:00:00.049) 0:03:32.851 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:01:42 +0000 (0:00:00.037) 0:03:32.888 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:01:42 +0000 (0:00:00.044) 0:03:32.933 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:01:42 +0000 (0:00:00.034) 0:03:32.967 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ 
"7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:01:43 +0000 (0:00:00.546) 0:03:33.513 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": 
"running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service": { "name": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { 
"name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:01:44 +0000 (0:00:01.002) 0:03:34.515 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:01:44 +0000 (0:00:00.056) 0:03:34.572 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d6192fdd9\x2df7eb\x2d4e8a\x2daed8\x2dd86682e15c34.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "name": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket tmp.mount systemd-readahead-collect.service dev-nvme1n1p1.device -.mount system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-nvme1n1p1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34 /dev/nvme1n1p1 /tmp/storage_testkPLDKVlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-6192fdd9-f7eb-4e8a-aed8-d86682e15c34 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; 
code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_testkPLDKVlukskey", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-nvme1n1p1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:01:44 +0000 (0:00:00.522) 0:03:35.095 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", 
"/dev/vdd", "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:01:45 +0000 (0:00:01.206) 0:03:36.302 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:01:45 +0000 (0:00:00.036) 0:03:36.339 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2d6192fdd9\x2df7eb\x2d4e8a\x2daed8\x2dd86682e15c34.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "name": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", 
"Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d6192fdd9\\x2df7eb\\x2d4e8a\\x2daed8\\x2dd86682e15c34.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:01:46 +0000 (0:00:00.523) 0:03:36.862 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": 
"defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:01:46 +0000 (0:00:00.042) 0:03:36.905 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set 
the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:01:46 +0000 (0:00:00.042) 0:03:36.947 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:01:46 +0000 (0:00:00.039) 0:03:36.987 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:01:46 +0000 (0:00:00.036) 0:03:37.023 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:01:47 +0000 (0:00:00.453) 0:03:37.476 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:01:47 +0000 (0:00:00.338) 0:03:37.815 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:01:47 +0000 (0:00:00.452) 0:03:38.268 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426491.1997247, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a86b16a80361bf068296cc00968149b78063d7e6", "ctime": 1658426489.2837245, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8521544, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658426489.2837245, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1690789586", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we 
just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:01:48 +0000 (0:00:00.316) 0:03:38.584 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:01:48 +0000 (0:00:00.022) 0:03:38.607 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:423 Thursday 21 July 2022 18:01:49 +0000 (0:00:00.842) 0:03:39.450 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:427 Thursday 21 July 2022 18:01:49 +0000 (0:00:00.041) 0:03:39.491 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:01:49 +0000 (0:00:00.042) 0:03:39.533 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:01:49 +0000 (0:00:00.050) 0:03:39.584 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:01:49 +0000 (0:00:00.037) 0:03:39.621 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "cb4d86dd-641f-4b08-abde-c73b587de633" }, "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "size": "4G", "type": "crypt", "uuid": "1efb59fc-232b-4834-8926-4d9f6ae9f611" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:01:49 +0000 (0:00:00.326) 0:03:39.948 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003919", "end": "2022-07-21 14:01:49.803491", "rc": 0, "start": "2022-07-21 14:01:49.799572" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:01:49 +0000 (0:00:00.319) 0:03:40.268 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003524", "end": "2022-07-21 14:01:50.125247", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:01:50.121723" } STDOUT: luks-cb4d86dd-641f-4b08-abde-c73b587de633 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were 
correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.319) 0:03:40.587 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.061) 0:03:40.648 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.036) 0:03:40.685 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.062) 0:03:40.748 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.114) 0:03:40.862 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.334) 0:03:41.197 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": "1" 
}, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.056) 0:03:41.253 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.054) 0:03:41.308 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:01:50 +0000 (0:00:00.063) 0:03:41.371 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.039) 0:03:41.411 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.054) 0:03:41.465 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.025) 0:03:41.491 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.047) 0:03:41.539 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.046) 0:03:41.586 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.026) 0:03:41.613 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.026) 0:03:41.639 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.027) 0:03:41.667 ********* 
skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.026) 0:03:41.693 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.025) 0:03:41.719 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.030) 0:03:41.749 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.028) 0:03:41.778 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.038) 0:03:41.816 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.046) 0:03:41.863 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.048) 0:03:41.911 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check that volume is LVM RAID] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.031) 0:03:41.943 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.029) 0:03:41.972 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.031) 0:03:42.004 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.047) 0:03:42.052 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.046) 0:03:42.098 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.024) 0:03:42.122 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.026) 0:03:42.149 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:01:51 
+0000 (0:00:00.023) 0:03:42.172 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.024) 0:03:42.197 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.045) 0:03:42.242 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:01:51 +0000 (0:00:00.141) 0:03:42.384 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.029) 0:03:42.413 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.044) 0:03:42.458 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.054) 0:03:42.512 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.048) 0:03:42.561 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.037) 0:03:42.598 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.038) 0:03:42.637 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.041) 
0:03:42.679 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.036) 0:03:42.715 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.035) 0:03:42.751 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.051) 0:03:42.803 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.046) 0:03:42.849 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.025) 0:03:42.874 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.023) 0:03:42.897 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.022) 0:03:42.920 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.022) 0:03:42.943 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.024) 0:03:42.967 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.022) 0:03:42.990 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.023) 0:03:43.014 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.035) 0:03:43.049 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.040) 0:03:43.090 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': None, '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.043) 0:03:43.133 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", 
"md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.049) 0:03:43.183 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.080) 0:03:43.264 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.043) 0:03:43.307 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "1efb59fc-232b-4834-8926-4d9f6ae9f611" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "1efb59fc-232b-4834-8926-4d9f6ae9f611" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:01:52 +0000 (0:00:00.058) 0:03:43.366 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.053) 0:03:43.419 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.050) 0:03:43.470 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.049) 0:03:43.519 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.022) 0:03:43.542 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.023) 0:03:43.565 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.023) 0:03:43.589 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.079) 0:03:43.668 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.146) 0:03:43.814 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.051) 0:03:43.865 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.051) 0:03:43.917 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.038) 0:03:43.956 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.034) 0:03:43.991 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.039) 0:03:44.030 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.041) 0:03:44.072 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426496.4667246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426486.1447246, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 86618, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426486.1447246, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:01:53 +0000 (0:00:00.305) 0:03:44.378 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:01:54 +0000 (0:00:00.038) 0:03:44.417 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:01:54 +0000 (0:00:00.039) 0:03:44.456 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:01:54 +0000 (0:00:00.040) 0:03:44.496 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:01:54 +0000 (0:00:00.024) 0:03:44.521 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:01:54 +0000 (0:00:00.040) 0:03:44.562 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426486.3037245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426486.3037245, "dev": 5, "device_type": 64513, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 87512, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426486.3037245, "nlink": 1, "path": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:01:54 +0000 (0:00:00.329) 0:03:44.891 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:01:54 +0000 (0:00:00.500) 0:03:45.392 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.035192", "end": "2022-07-21 14:01:55.275461", "rc": 0, "start": "2022-07-21 14:01:55.240269" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: serpent Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 0b 23 59 4e a1 99 09 9c 11 4d 88 c4 af d5 85 ff dc 01 bb 4c MK salt: 4b 34 73 2f ef 56 a3 02 b0 52 b8 77 e4 84 05 48 d9 51 ae 90 9a 59 00 41 c8 ae 6b e0 91 d6 f9 b3 MK iterations: 23043 UUID: cb4d86dd-641f-4b08-abde-c73b587de633 Key Slot 0: ENABLED Iterations: 365612 Salt: 65 66 ab 84 41 c3 38 f0 37 09 f9 b3 75 92 18 61 0f a3 4c 5a 34 e0 d0 82 62 a6 74 14 e0 e6 23 8f Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.347) 0:03:45.739 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.041) 0:03:45.780 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] 
**************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.055) 0:03:45.836 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.046) 0:03:45.882 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.042) 0:03:45.925 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.054) 0:03:45.979 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.027) 0:03:46.006 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.025) 0:03:46.032 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-cb4d86dd-641f-4b08-abde-c73b587de633 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.055) 0:03:46.087 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.053) 0:03:46.140 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.051) 0:03:46.192 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.095) 0:03:46.287 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.061) 0:03:46.349 ********* ok: [/cache/rhel-7.qcow2] => { 
"ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:01:55 +0000 (0:00:00.039) 0:03:46.388 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.040) 0:03:46.429 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.040) 0:03:46.469 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.043) 0:03:46.513 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.037) 0:03:46.550 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.039) 0:03:46.590 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.038) 0:03:46.628 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.043) 0:03:46.672 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.309) 0:03:46.981 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:01:56 +0000 (0:00:00.377) 0:03:47.358 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] 
******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.052) 0:03:47.411 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.040) 0:03:47.452 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.039) 0:03:47.491 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.036) 0:03:47.528 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.039) 0:03:47.567 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.041) 0:03:47.608 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.037) 0:03:47.646 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.041) 0:03:47.687 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.038) 0:03:47.726 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.056) 0:03:47.783 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023474", "end": "2022-07-21 14:01:57.663250", "rc": 0, "start": "2022-07-21 14:01:57.639776" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.342) 0:03:48.126 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.053) 0:03:48.179 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.054) 0:03:48.234 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.040) 0:03:48.274 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.040) 0:03:48.315 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.038) 0:03:48.354 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:01:57 +0000 (0:00:00.039) 0:03:48.393 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:01:58 +0000 (0:00:00.034) 0:03:48.428 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:01:58 +0000 (0:00:00.020) 0:03:48.449 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/create-test-file.yml:10 Thursday 21 July 2022 18:01:58 +0000 (0:00:00.037) 0:03:48.487 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:433 Thursday 21 July 2022 18:01:58 +0000 (0:00:00.344) 0:03:48.831 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific 
variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:01:58 +0000 (0:00:00.040) 0:03:48.872 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:01:58 +0000 (0:00:00.035) 0:03:48.908 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:01:58 +0000 (0:00:00.412) 0:03:49.321 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:01:59 +0000 (0:00:00.107) 0:03:49.429 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:01:59 +0000 (0:00:00.035) 0:03:49.464 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:01:59 +0000 (0:00:00.084) 0:03:49.549 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:01:59 +0000 (0:00:00.096) 0:03:49.645 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:01:59 +0000 (0:00:00.021) 0:03:49.667 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:01:59 +0000 (0:00:00.720) 0:03:50.387 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:02:00 +0000 (0:00:00.042) 0:03:50.429 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:02:00 +0000 (0:00:00.052) 0:03:50.482 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:02:01 +0000 (0:00:01.148) 0:03:51.630 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:02:01 +0000 (0:00:00.050) 0:03:51.681 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:02:01 +0000 (0:00:00.038) 0:03:51.720 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:02:01 +0000 (0:00:00.039) 0:03:51.759 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:02:01 +0000 (0:00:00.033) 0:03:51.793 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:02:01 +0000 (0:00:00.567) 0:03:52.360 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": 
"crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service": { "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": 
"systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:02:02 +0000 (0:00:00.998) 0:03:53.358 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:02:03 +0000 (0:00:00.060) 0:03:53.419 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dcb4d86dd\x2d641f\x2d4b08\x2dabde\x2dc73b587de633.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service systemd-readahead-replay.service cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-cb4d86dd-641f-4b08-abde-c73b587de633", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-cb4d86dd-641f-4b08-abde-c73b587de633 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-cb4d86dd-641f-4b08-abde-c73b587de633 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": 
"dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:02:03 +0000 (0:00:00.482) 0:03:53.901 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-cb4d86dd-641f-4b08-abde-c73b587de633' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 18:02:04 +0000 (0:00:01.190) 0:03:55.092 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 
'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'luks-cb4d86dd-641f-4b08-abde-c73b587de633' in safe mode due to encryption removal", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:02:04 +0000 (0:00:00.044) 0:03:55.136 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dcb4d86dd\x2d641f\x2d4b08\x2dabde\x2dc73b587de633.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", 
"ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:453 Thursday 21 July 2022 18:02:05 +0000 (0:00:00.483) 0:03:55.620 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:459 Thursday 21 July 2022 18:02:05 +0000 (0:00:00.041) 0:03:55.661 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:10 Thursday 21 July 2022 18:02:05 +0000 (0:00:00.051) 0:03:55.713 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426518.3627245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426518.3627245, "dev": 64513, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658426518.3627245, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072701663300", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:15 Thursday 21 July 2022 18:02:05 +0000 (0:00:00.349) 0:03:56.062 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:466 Thursday 21 July 2022 18:02:05 +0000 (0:00:00.086) 0:03:56.149 ********* TASK [fedora.linux_system_roles.storage : set platform/version 
specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:02:05 +0000 (0:00:00.093) 0:03:56.242 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:02:05 +0000 (0:00:00.035) 0:03:56.278 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:02:06 +0000 (0:00:00.422) 0:03:56.701 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:02:06 +0000 (0:00:00.069) 0:03:56.770 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:02:06 +0000 (0:00:00.039) 0:03:56.809 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:02:06 +0000 (0:00:00.039) 0:03:56.848 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:02:06 +0000 (0:00:00.060) 0:03:56.909 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:02:06 +0000 (0:00:00.021) 0:03:56.930 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:02:07 +0000 (0:00:00.654) 0:03:57.585 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:02:07 +0000 (0:00:00.040) 0:03:57.626 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:02:07 +0000 (0:00:00.039) 0:03:57.666 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:02:08 +0000 (0:00:01.115) 0:03:58.781 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:02:08 +0000 (0:00:00.047) 0:03:58.829 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:02:08 +0000 (0:00:00.034) 0:03:58.863 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:02:08 +0000 (0:00:00.037) 0:03:58.901 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:02:08 +0000 (0:00:00.035) 0:03:58.937 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:02:09 +0000 (0:00:00.534) 0:03:59.472 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": 
"crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service": { "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": 
"systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:02:10 +0000 (0:00:01.016) 0:04:00.488 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:02:10 +0000 (0:00:00.119) 0:04:00.607 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dcb4d86dd\x2d641f\x2d4b08\x2dabde\x2dc73b587de633.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-readahead-collect.service dev-mapper-foo\\x2dtest1.device systemd-readahead-replay.service system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-cb4d86dd-641f-4b08-abde-c73b587de633", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-cb4d86dd-641f-4b08-abde-c73b587de633 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-cb4d86dd-641f-4b08-abde-c73b587de633 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": 
"dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:02:10 +0000 (0:00:00.576) 0:04:01.184 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cb4d86dd-641f-4b08-abde-c73b587de633", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/mapper/foo-test1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:03:12 +0000 (0:01:01.962) 0:05:03.147 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:03:12 +0000 (0:00:00.042) 
0:05:03.190 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dcb4d86dd\x2d641f\x2d4b08\x2dabde\x2dc73b587de633.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", 
"StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:03:13 +0000 (0:00:00.501) 0:05:03.692 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cb4d86dd-641f-4b08-abde-c73b587de633", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/mapper/foo-test1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list 
of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:03:13 +0000 (0:00:00.044) 0:05:03.736 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:03:13 +0000 (0:00:00.044) 0:05:03.781 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:03:13 +0000 (0:00:00.043) 0:05:03.825 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cb4d86dd-641f-4b08-abde-c73b587de633" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:03:13 +0000 (0:00:00.376) 0:05:04.201 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:03:14 +0000 (0:00:00.467) 0:05:04.668 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/foo-test1', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:03:14 +0000 (0:00:00.371) 0:05:05.040 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:03:15 +0000 (0:00:00.463) 0:05:05.503 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426491.1997247, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a86b16a80361bf068296cc00968149b78063d7e6", "ctime": 1658426489.2837245, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8521544, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658426489.2837245, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1690789586", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:03:15 +0000 (0:00:00.322) 0:05:05.825 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-cb4d86dd-641f-4b08-abde-c73b587de633', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cb4d86dd-641f-4b08-abde-c73b587de633", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:03:15 +0000 (0:00:00.392) 0:05:06.218 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:482 Thursday 21 July 2022 18:03:16 +0000 
(0:00:00.953) 0:05:07.171 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:03:16 +0000 (0:00:00.039) 0:05:07.211 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:03:16 +0000 (0:00:00.052) 0:05:07.263 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:03:16 +0000 (0:00:00.040) 0:05:07.304 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "1658ae73-9fdb-427e-a87c-e928d66979a6" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:03:17 +0000 (0:00:00.324) 0:05:07.629 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003624", "end": "2022-07-21 14:03:17.477677", "rc": 0, "start": "2022-07-21 14:03:17.474053" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:03:17 +0000 (0:00:00.313) 0:05:07.943 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003574", "end": "2022-07-21 14:03:17.790935", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:03:17.787361" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:03:17 +0000 (0:00:00.314) 0:05:08.257 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 
'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:03:17 +0000 (0:00:00.061) 0:05:08.319 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:03:17 +0000 (0:00:00.032) 0:05:08.352 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.045) 0:05:08.397 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.052) 0:05:08.449 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.321) 0:05:08.771 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.050) 0:05:08.821 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify 
PV count] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.050) 0:05:08.871 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.051) 0:05:08.923 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.037) 0:05:08.961 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.047) 0:05:09.008 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.022) 0:05:09.030 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.044) 0:05:09.075 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.053) 0:05:09.128 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.026) 0:05:09.155 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.022) 0:05:09.178 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.022) 0:05:09.200 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.021) 0:05:09.222 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.022) 0:05:09.244 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.021) 0:05:09.266 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.022) 0:05:09.288 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:03:18 +0000 (0:00:00.092) 0:05:09.381 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.042) 0:05:09.423 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.044) 0:05:09.468 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.029) 0:05:09.498 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.028) 0:05:09.526 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.028) 0:05:09.555 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.042) 0:05:09.597 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.043) 0:05:09.641 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.022) 0:05:09.664 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.023) 0:05:09.687 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.023) 0:05:09.710 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.024) 0:05:09.735 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 
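
Aside for readers of this log: the crypttab validation tasks that follow (from verify-pool-members-encryption.yml and verify-pool-member-crypttab.yml) boil down to counting the /etc/crypttab lines that reference each pool member and asserting that the count matches the expectation set just before them. The snippet below is a minimal illustrative sketch of that pattern, not the contents of the actual test files; it assumes /etc/crypttab was read earlier into a registered variable named storage_test_crypttab (a name that appears later in this log during variable cleanup) and reuses the fact names visible in the output (_storage_test_pool_member_path, _storage_test_crypttab_entries, _storage_test_expected_crypttab_entries).

# Hypothetical reconstruction for illustration only; the real tasks live under
# /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml.
- name: Read /etc/crypttab (assumed to have been registered earlier in the test)
  command: cat /etc/crypttab
  register: storage_test_crypttab
  changed_when: false

- name: Collect crypttab entries that reference this pool member
  set_fact:
    _storage_test_crypttab_entries: "{{ storage_test_crypttab.stdout_lines
      | select('search', _storage_test_pool_member_path) | list }}"

- name: Check for /etc/crypttab entry
  assert:
    that:
      - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
    msg: "Unexpected number of crypttab entries for {{ _storage_test_pool_member_path }}"

Because the volume still has encryption disabled at this point in the test run, the expected entry count is "0"; that is why the entry-count assertion below passes while the format, backing-device, and key-file checks for the crypttab entry are skipped.
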
TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.046) 0:05:09.782 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.052) 0:05:09.834 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.028) 0:05:09.863 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.042) 0:05:09.906 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.051) 0:05:09.957 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.049) 0:05:10.006 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.036) 0:05:10.043 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.036) 0:05:10.080 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.042) 0:05:10.123 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.034) 0:05:10.158 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, 
"_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.035) 0:05:10.194 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.048) 0:05:10.242 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.046) 0:05:10.289 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.026) 0:05:10.315 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.026) 0:05:10.342 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.025) 0:05:10.367 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:03:19 +0000 (0:00:00.026) 0:05:10.393 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:03:20 
+0000 (0:00:00.025) 0:05:10.419 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.027) 0:05:10.446 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.023) 0:05:10.470 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.087) 0:05:10.557 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.119) 0:05:10.677 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/foo-test1', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-0', 'encryption': False, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': 0, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': 'luks1', 'cache_size': 0, '_mount_id': '/dev/mapper/foo-test1', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.050) 0:05:10.727 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.059) 0:05:10.787 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for 
/cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.090) 0:05:10.877 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.042) 0:05:10.919 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037768, "block_size": 4096, "block_total": 1046016, "block_used": 8248, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4250697728, "size_total": 4284481536, "uuid": "1658ae73-9fdb-427e-a87c-e928d66979a6" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037768, "block_size": 4096, "block_total": 1046016, "block_used": 8248, "device": "/dev/mapper/foo-test1", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4250697728, "size_total": 4284481536, "uuid": "1658ae73-9fdb-427e-a87c-e928d66979a6" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.059) 0:05:10.979 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.064) 0:05:11.043 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.058) 0:05:11.102 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.062) 0:05:11.164 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.026) 0:05:11.191 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.027) 0:05:11.218 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.025) 0:05:11.244 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.039) 0:05:11.284 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:03:20 +0000 (0:00:00.069) 0:05:11.354 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.049) 0:05:11.403 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.060) 0:05:11.463 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.038) 0:05:11.502 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.038) 0:05:11.540 ********* ok: 
[/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.040) 0:05:11.581 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.040) 0:05:11.621 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426592.6327245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426592.6327245, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 105585, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426592.6327245, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.328) 0:05:11.950 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.040) 0:05:11.990 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.039) 0:05:12.030 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.035) 0:05:12.065 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.024) 0:05:12.089 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.040) 0:05:12.130 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:03:21 +0000 (0:00:00.024) 0:05:12.155 
********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.540) 0:05:12.696 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.026) 0:05:12.722 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.027) 0:05:12.749 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.054) 0:05:12.804 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.023) 0:05:12.827 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.022) 0:05:12.850 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.022) 0:05:12.872 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.024) 0:05:12.896 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.022) 0:05:12.919 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.103) 0:05:13.022 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.051) 0:05:13.074 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.035) 0:05:13.109 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.035) 0:05:13.145 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.039) 0:05:13.185 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.094) 0:05:13.279 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.037) 0:05:13.316 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:03:22 +0000 (0:00:00.037) 0:05:13.354 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.039) 0:05:13.393 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.037) 0:05:13.430 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.036) 0:05:13.467 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.045) 0:05:13.513 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.044) 0:05:13.557 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.352) 0:05:13.909 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.328) 0:05:14.238 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.053) 0:05:14.291 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.037) 0:05:14.329 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:03:23 +0000 (0:00:00.036) 0:05:14.366 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.036) 0:05:14.402 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.036) 0:05:14.439 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.040) 0:05:14.479 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.037) 0:05:14.517 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.040) 0:05:14.557 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.040) 0:05:14.597 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.055) 0:05:14.653 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.026017", "end": "2022-07-21 14:03:24.531300", "rc": 0, "start": "2022-07-21 14:03:24.505283" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.345) 0:05:14.999 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.049) 0:05:15.049 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.049) 0:05:15.098 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.038) 0:05:15.137 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.034) 0:05:15.172 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.034) 0:05:15.207 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.034) 0:05:15.241 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] 
****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.040) 0:05:15.282 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.024) 0:05:15.306 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [create a file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/create-test-file.yml:10 Thursday 21 July 2022 18:03:24 +0000 (0:00:00.035) 0:05:15.342 ********* changed: [/cache/rhel-7.qcow2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:488 Thursday 21 July 2022 18:03:25 +0000 (0:00:00.328) 0:05:15.671 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:03:25 +0000 (0:00:00.038) 0:05:15.710 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:03:25 +0000 (0:00:00.034) 0:05:15.744 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:03:25 +0000 (0:00:00.418) 0:05:16.163 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:03:25 +0000 (0:00:00.157) 0:05:16.321 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { 
"_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:03:25 +0000 (0:00:00.035) 0:05:16.356 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:03:26 +0000 (0:00:00.037) 0:05:16.394 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:03:26 +0000 (0:00:00.057) 0:05:16.451 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:03:26 +0000 (0:00:00.021) 0:05:16.473 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:03:26 +0000 (0:00:00.748) 0:05:17.222 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:03:26 +0000 (0:00:00.042) 0:05:17.264 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:03:26 +0000 (0:00:00.038) 0:05:17.302 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], 
"mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:03:27 +0000 (0:00:01.073) 0:05:18.376 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:03:28 +0000 (0:00:00.046) 0:05:18.423 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:03:28 +0000 (0:00:00.037) 0:05:18.460 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:03:28 +0000 (0:00:00.045) 0:05:18.506 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:03:28 +0000 (0:00:00.037) 0:05:18.544 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:03:28 +0000 (0:00:00.527) 0:05:19.072 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": 
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { 
"name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { 
"name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service": { "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": 
"systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:03:29 +0000 (0:00:01.018) 0:05:20.090 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:03:29 +0000 (0:00:00.060) 0:05:20.151 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dcb4d86dd\x2d641f\x2d4b08\x2dabde\x2dc73b587de633.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "status": { "ActiveEnterTimestampMonotonic": "0", 
"ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service cryptsetup-pre.target systemd-readahead-replay.service system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-cb4d86dd-641f-4b08-abde-c73b587de633", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-cb4d86dd-641f-4b08-abde-c73b587de633 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-cb4d86dd-641f-4b08-abde-c73b587de633 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:03:30 +0000 (0:00:00.478) 0:05:20.629 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : failed message] ********************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:86 Thursday 21 July 2022 18:03:31 +0000 (0:00:01.193) 0:05:21.823 ********* fatal: [/cache/rhel-7.qcow2]: FAILED! 
=> { "changed": false } MSG: {'crypts': [], 'mounts': [], 'leaves': [], 'changed': False, 'actions': [], 'failed': True, 'volumes': [], 'invocation': {'module_args': {'packages_only': False, 'disklabel_type': None, 'diskvolume_mkfs_option_map': {'ext4': '-F', 'ext3': '-F', 'ext2': '-F'}, 'safe_mode': True, 'pools': [{'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, 'raid_spare_count': None, 'raid_disks': [], 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}], 'volumes': [], 'pool_defaults': {'encryption_password': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_cipher': None, 'disks': [], 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_device_count': None, 'state': 'present', 'volumes': [], 'raid_chunk_size': None, 'type': 'lvm', 'raid_level': None, 'raid_spare_count': None}, 'volume_defaults': {'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'xfs', 'mount_options': 'defaults', 'size': 0, 'mount_point': '', 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'fs_overwrite_existing': True, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'mount_passno': 0, 'raid_spare_count': None, 'cache_mode': None, 'deduplication': None, 'cached': False, 'type': 'lvm', 'disks': [], 'thin_pool_size': None, 'thin': None, 'mount_check': 0, 'cache_size': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}, 'use_partitions': None}}, 'pools': [], 'packages': [], 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:03:31 +0000 (0:00:00.040) 0:05:21.864 ********* changed: [/cache/rhel-7.qcow2] => (item=systemd-cryptsetup@luks\x2dcb4d86dd\x2d641f\x2d4b08\x2dabde\x2dc73b587de633.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "name": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "status": { 
"ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "7149", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "7149", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dcb4d86dd\\x2d641f\\x2d4b08\\x2dabde\\x2dc73b587de633.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", 
"UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:508 Thursday 21 July 2022 18:03:31 +0000 (0:00:00.528) 0:05:22.392 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the output of the safe_mode test] ********************************* task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:514 Thursday 21 July 2022 18:03:32 +0000 (0:00:00.035) 0:05:22.428 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [stat the file] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:10 Thursday 21 July 2022 18:03:32 +0000 (0:00:00.089) 0:05:22.517 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426605.2007246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426605.2007246, "dev": 64512, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1658426605.2007246, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "815415956", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [assert file presence] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-data-preservation.yml:15 Thursday 21 July 2022 18:03:32 +0000 (0:00:00.325) 0:05:22.842 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:521 Thursday 21 July 2022 18:03:32 +0000 (0:00:00.040) 0:05:22.883 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:03:32 +0000 (0:00:00.040) 0:05:22.923 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:03:32 +0000 (0:00:00.036) 0:05:22.960 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:03:33 +0000 (0:00:00.455) 0:05:23.415 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": 
"Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:03:33 +0000 (0:00:00.068) 0:05:23.484 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:03:33 +0000 (0:00:00.039) 0:05:23.523 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:03:33 +0000 (0:00:00.038) 0:05:23.562 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:03:33 +0000 (0:00:00.059) 0:05:23.621 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:03:33 +0000 (0:00:00.020) 0:05:23.642 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:03:34 +0000 (0:00:00.753) 0:05:24.395 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": [ { "disks": [ "nvme1n1" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:03:34 +0000 (0:00:00.039) 0:05:24.434 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:03:34 +0000 (0:00:00.040) 0:05:24.475 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:03:35 +0000 (0:00:01.080) 0:05:25.556 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:03:35 +0000 (0:00:00.047) 0:05:25.603 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:03:35 +0000 (0:00:00.034) 0:05:25.638 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:03:35 +0000 (0:00:00.038) 0:05:25.677 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:03:35 +0000 (0:00:00.033) 0:05:25.710 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed" ] } TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:03:35 +0000 (0:00:00.524) 0:05:26.234 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" 
}, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, "rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { 
"name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:03:36 +0000 (0:00:01.004) 0:05:27.239 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { 
"storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:03:36 +0000 (0:00:00.062) 0:05:27.301 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:03:36 +0000 (0:00:00.021) 0:05:27.323 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "password": "-", "state": "present" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:03:44 +0000 (0:00:07.733) 0:05:35.057 ********* skipping: 
[/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:03:44 +0000 (0:00:00.039) 0:05:35.096 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:03:44 +0000 (0:00:00.022) 0:05:35.118 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd", "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:03:44 +0000 
(0:00:00.042) 0:05:35.160 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:03:44 +0000 (0:00:00.041) 0:05:35.202 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:03:44 +0000 (0:00:00.039) 0:05:35.242 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/foo-test1', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:03:45 +0000 (0:00:00.412) 0:05:35.655 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:03:45 +0000 (0:00:00.469) 0:05:36.125 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': 
'/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'dump': 0, 'passno': 0, 'fstype': 'xfs', 'state': 'mounted', 'path': '/opt/test1', 'opts': 'defaults'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:03:46 +0000 (0:00:00.476) 0:05:36.601 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:03:46 +0000 (0:00:00.456) 0:05:37.058 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426597.7897246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658426595.7467246, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12585524, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658426595.7457247, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072742524864", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:03:47 +0000 (0:00:00.344) 0:05:37.402 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'present', 'password': '-', 'name': 'luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:03:47 +0000 (0:00:00.368) 0:05:37.771 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:537 Thursday 21 July 2022 18:03:48 +0000 (0:00:00.847) 0:05:38.619 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task 
path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:03:48 +0000 (0:00:00.040) 0:05:38.659 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_pools_list": [ { "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "yabbadabbadoo", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:03:48 +0000 (0:00:00.053) 0:05:38.713 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:03:48 +0000 (0:00:00.037) 0:05:38.751 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "cad32e3f-2d91-4801-814a-1715e2ef420a" }, "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a": { "fstype": "xfs", "label": "", "name": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "size": "4G", "type": "crypt", "uuid": "266a26f5-110a-4da0-b857-97272e2959ba" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "LVM2_member", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:03:48 +0000 (0:00:00.312) 0:05:39.063 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003377", "end": "2022-07-21 14:03:48.894839", "rc": 0, "start": "2022-07-21 14:03:48.891462" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 /dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:03:48 +0000 (0:00:00.295) 0:05:39.359 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003394", "end": "2022-07-21 14:03:49.183741", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:03:49.180347" } STDOUT: luks-cad32e3f-2d91-4801-814a-1715e2ef420a /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were 
correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.290) 0:05:39.649 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml for /cache/rhel-7.qcow2 => (item={'name': 'foo', 'encryption_password': None, 'state': 'present', 'raid_metadata_version': None, 'encryption': False, 'encryption_key_size': None, 'disks': ['nvme1n1'], 'raid_level': None, 'encryption_luks_version': None, 'raid_device_count': None, 'encryption_key': None, 'volumes': [{'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}], 'raid_chunk_size': None, 'type': 'lvm', 'encryption_cipher': None, 'raid_spare_count': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.058) 0:05:39.708 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.033) 0:05:39.741 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml for /cache/rhel-7.qcow2 => (item=members) included: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml for /cache/rhel-7.qcow2 => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.046) 0:05:39.788 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.058) 0:05:39.846 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/nvme1n1", "pv": "/dev/nvme1n1" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.321) 0:05:40.168 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { 
"__pvs_lvm_len": "1" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.048) 0:05:40.217 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/nvme1n1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.048) 0:05:40.265 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.049) 0:05:40.314 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:03:49 +0000 (0:00:00.041) 0:05:40.356 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.051) 0:05:40.408 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.025) 0:05:40.434 ********* ok: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/nvme1n1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.044) 0:05:40.478 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml for /cache/rhel-7.qcow2 TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.044) 0:05:40.523 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.025) 0:05:40.548 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.025) 0:05:40.574 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.072) 
0:05:40.646 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.025) 0:05:40.672 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.024) 0:05:40.697 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.025) 0:05:40.722 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.026) 0:05:40.749 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.036) 0:05:40.785 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml for /cache/rhel-7.qcow2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.045) 0:05:40.831 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.045) 0:05:40.877 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.027) 0:05:40.904 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.027) 0:05:40.932 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.029) 0:05:40.962 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml for /cache/rhel-7.qcow2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.046) 0:05:41.008 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.045) 0:05:41.054 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.029) 0:05:41.083 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.024) 0:05:41.108 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.030) 0:05:41.138 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.025) 0:05:41.163 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml for /cache/rhel-7.qcow2 TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.049) 0:05:41.212 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.053) 0:05:41.266 ********* skipping: [/cache/rhel-7.qcow2] => (item=/dev/nvme1n1) => { "_storage_test_pool_member_path": "/dev/nvme1n1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.029) 0:05:41.295 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml for /cache/rhel-7.qcow2 => (item=/dev/nvme1n1) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:03:50 +0000 (0:00:00.044) 0:05:41.340 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.054) 0:05:41.394 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.053) 0:05:41.448 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.037) 0:05:41.485 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.039) 0:05:41.525 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.038) 0:05:41.563 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.036) 0:05:41.600 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.035) 0:05:41.636 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml for /cache/rhel-7.qcow2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.046) 0:05:41.682 ********* included: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.043) 0:05:41.725 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.025) 0:05:41.751 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.023) 0:05:41.774 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.022) 
0:05:41.797 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.023) 0:05:41.820 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.064) 0:05:41.885 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.023) 0:05:41.909 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.023) 0:05:41.932 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.034) 0:05:41.967 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.035) 0:05:42.002 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/mapper/foo-test1', 'raid_metadata_version': None, 'raid_level': None, 'fs_type': 'xfs', 'mount_options': 'defaults', 'type': 'lvm', '_device': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'size': '4g', 'mount_point': '/opt/test1', 'compression': None, 'encryption_password': 'yabbadabbadoo', '_kernel_device': '/dev/dm-1', 'encryption': True, 'mount_device_identifier': 'uuid', 'name': 'test1', 'raid_device_count': None, 'state': 'present', 'vdo_pool_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'raid_spare_count': None, 'raid_disks': [], '_raw_kernel_device': '/dev/dm-0', 'cache_mode': None, 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': False, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'cache_devices': [], 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.040) 0:05:42.042 ********* ok: 
[/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.048) 0:05:42.091 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.079) 0:05:42.170 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.044) 0:05:42.214 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "266a26f5-110a-4da0-b857-97272e2959ba" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1037256, "block_size": 4096, "block_total": 1045504, "block_used": 8248, "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fstype": "xfs", "inode_available": 2096125, "inode_total": 2096128, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,noquota", "size_available": 4248600576, "size_total": 4282384384, "uuid": "266a26f5-110a-4da0-b857-97272e2959ba" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.060) 0:05:42.275 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.053) 0:05:42.328 
********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:03:51 +0000 (0:00:00.049) 0:05:42.378 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.049) 0:05:42.427 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.022) 0:05:42.449 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.024) 0:05:42.473 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.022) 0:05:42.496 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.033) 0:05:42.530 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.061) 0:05:42.592 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.048) 0:05:42.640 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.053) 0:05:42.693 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } 
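[Editor's note] For readers tracing the fstab verification steps logged above, the following is a minimal sketch of what the assertion tasks in test-verify-volume-fstab.yml plausibly look like. The variable names (storage_test_fstab_id_matches, storage_test_fstab_expected_id_matches, and their mount-point counterparts) are taken from the set_fact output recorded in this log; the task bodies themselves are an assumption for illustration, not the role's verbatim source.

    # Hypothetical reconstruction, assuming the facts set earlier in this log.
    - name: Verify that the device identifier appears in /etc/fstab (sketch)
      ansible.builtin.assert:
        that:
          # number of fstab lines matching the volume's mount id should equal the expected count
          - storage_test_fstab_id_matches | length | string == storage_test_fstab_expected_id_matches
        msg: "Volume mount id not found in /etc/fstab the expected number of times"

    - name: Verify the fstab mount point (sketch)
      ansible.builtin.assert:
        that:
          - storage_test_fstab_mount_point_matches | length | string == storage_test_fstab_expected_mount_point_matches
        msg: "Mount point not found in /etc/fstab the expected number of times"

A passing run of such tasks produces exactly the "All assertions passed" records shown above; a mismatch would instead surface as a failed task with the msg text.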
TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.035) 0:05:42.729 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.039) 0:05:42.769 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.039) 0:05:42.808 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.039) 0:05:42.847 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426624.4197245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426624.4197245, "dev": 5, "device_type": 64512, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 105585, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426624.4197245, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.329) 0:05:43.177 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.042) 0:05:43.220 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.051) 0:05:43.271 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.045) 0:05:43.316 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.026) 0:05:43.343 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:03:52 +0000 (0:00:00.042) 0:05:43.385 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426624.5347245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426624.5347245, "dev": 5, "device_type": 64513, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 114912, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658426624.5347245, "nlink": 1, "path": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:03:53 +0000 (0:00:00.360) 0:05:43.746 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:03:53 +0000 (0:00:00.590) 0:05:44.337 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.032326", "end": "2022-07-21 14:03:54.265325", "rc": 0, "start": "2022-07-21 14:03:54.232999" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 4096 MK bits: 512 MK digest: 52 1e 0f 9d fd 74 a5 c8 da ce 90 f6 31 71 c3 87 6a 2b 3d 07 MK salt: 05 8e 8c 30 e6 3a 7c cc 57 00 05 cb 1b 04 f9 90 bc 45 75 82 f9 58 cf 9b 15 ab 88 d5 d4 35 50 b2 MK iterations: 22598 UUID: cad32e3f-2d91-4801-814a-1715e2ef420a Key Slot 0: ENABLED Iterations: 363080 Salt: 2d 33 e5 04 c5 15 57 1e 7b d2 11 cb f9 51 e0 14 6b 13 65 00 0f 4f 76 8e 78 c6 f0 b0 4c d4 4c d5 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.393) 0:05:44.730 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.103) 0:05:44.833 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we 
got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.107) 0:05:44.941 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.116) 0:05:45.058 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.039) 0:05:45.098 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.024) 0:05:45.122 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.025) 0:05:45.147 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.023) 0:05:45.171 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-cad32e3f-2d91-4801-814a-1715e2ef420a /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.054) 0:05:45.225 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.049) 0:05:45.275 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.050) 0:05:45.326 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:03:54 +0000 (0:00:00.051) 0:05:45.377 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:03:55 +0000 
(0:00:00.052) 0:05:45.430 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.040) 0:05:45.471 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.041) 0:05:45.512 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.039) 0:05:45.551 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.036) 0:05:45.588 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.039) 0:05:45.628 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.037) 0:05:45.666 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.035) 0:05:45.701 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.035) 0:05:45.737 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.320) 0:05:46.057 ********* ok: [/cache/rhel-7.qcow2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:03:55 +0000 (0:00:00.313) 0:05:46.370 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, 
"changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.050) 0:05:46.421 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.036) 0:05:46.458 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.040) 0:05:46.498 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.041) 0:05:46.540 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.036) 0:05:46.576 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.034) 0:05:46.611 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.042) 0:05:46.654 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.037) 0:05:46.691 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.040) 0:05:46.731 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.051) 0:05:46.783 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021682", "end": "2022-07-21 14:03:56.657289", "rc": 0, "start": "2022-07-21 14:03:56.635607" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [set_fact] **************************************************************** task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.340) 0:05:47.124 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.051) 0:05:47.175 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.053) 0:05:47.229 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.044) 0:05:47.273 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.040) 0:05:47.314 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.039) 0:05:47.354 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:03:56 +0000 (0:00:00.038) 0:05:47.393 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.040) 0:05:47.434 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.023) 0:05:47.457 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:539 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.039) 0:05:47.497 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.062) 0:05:47.560 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.038) 0:05:47.599 ********* ok: [/cache/rhel-7.qcow2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.428) 0:05:48.028 ********* skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/rhel-7.qcow2] => (item=RedHat_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.yml" } skipping: [/cache/rhel-7.qcow2] => (item=RedHat_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.124) 0:05:48.153 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.040) 0:05:48.193 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.037) 0:05:48.231 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.113) 0:05:48.345 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:03:57 +0000 (0:00:00.023) 0:05:48.368 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ 
"python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed" ] } TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:03:58 +0000 (0:00:00.757) 0:05:49.126 ********* ok: [/cache/rhel-7.qcow2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:03:58 +0000 (0:00:00.042) 0:05:49.169 ********* ok: [/cache/rhel-7.qcow2] => { "storage_volumes": [ { "disks": [ "nvme1n1" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:03:58 +0000 (0:00:00.038) 0:05:49.207 ********* ok: [/cache/rhel-7.qcow2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:03:59 +0000 (0:00:01.171) 0:05:50.379 ********* included: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-7.qcow2 TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:04:00 +0000 (0:00:00.051) 0:05:50.430 ********* TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:04:00 +0000 (0:00:00.040) 0:05:50.471 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:04:00 +0000 (0:00:00.041) 0:05:50.512 ********* TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:04:00 +0000 (0:00:00.037) 0:05:50.550 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [] } TASK [fedora.linux_system_roles.storage : get service facts] 
******************* task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:04:00 +0000 (0:00:00.539) 0:05:51.089 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "stopped", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", 
"state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "exim.service": { "name": "exim.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@259:1.service": { "name": "lvm2-pvscan@259:1.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "active" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": 
"nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure-server.service": { "name": "nfs-secure-server.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ovirt-guest-agent.service": { "name": "ovirt-guest-agent.service", "source": "systemd", "state": "stopped", "status": "failed" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rhcd.service": { "name": "rhcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhnsd": { "name": "rhnsd", "source": "sysv", "state": "running", "status": "enabled" }, "rhnsd.service": { "name": "rhnsd.service", "source": "systemd", "state": "running", "status": "active" }, 
"rhsm-facts.service": { "name": "rhsm-facts.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsm.service": { "name": "rhsm.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rhsmcertd.service": { "name": "rhsmcertd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "active" }, "sendmail.service": { "name": "sendmail.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { 
"name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:04:01 +0000 (0:00:01.025) 0:05:52.115 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:04:01 +0000 (0:00:00.059) 0:05:52.174 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:04:01 +0000 (0:00:00.021) 0:05:52.195 ********* changed: [/cache/rhel-7.qcow2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:04:33 +0000 (0:00:31.884) 0:06:24.080 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup 
services] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:04:33 +0000 (0:00:00.042) 0:06:24.122 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:04:33 +0000 (0:00:00.021) 0:06:24.144 ********* ok: [/cache/rhel-7.qcow2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/nvme1n1", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb", "/dev/vdc", "/dev/vdd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:04:33 +0000 (0:00:00.089) 0:06:24.233 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:04:33 +0000 (0:00:00.040) 0:06:24.274 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:04:33 +0000 (0:00:00.040) 0:06:24.314 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/rhel-7.qcow2] => (item={'src': '/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'state': 'absent', 'path': '/opt/test1', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:04:34 +0000 (0:00:00.399) 0:06:24.714 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:04:34 +0000 (0:00:00.469) 0:06:25.183 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:04:34 +0000 (0:00:00.042) 0:06:25.226 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:04:35 +0000 (0:00:00.534) 0:06:25.760 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426629.1827245, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "44e3be7d6443940bf1cbe1a0c1c87561496ac2c4", "ctime": 1658426627.2977245, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8521544, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1658426627.2977245, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": 
false, "rusr": true, "size": 66, "uid": 0, "version": "1012482526", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:04:35 +0000 (0:00:00.397) 0:06:26.158 ********* changed: [/cache/rhel-7.qcow2] => (item={'state': 'absent', 'password': '-', 'name': 'luks-cad32e3f-2d91-4801-814a-1715e2ef420a', 'backing_device': '/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cad32e3f-2d91-4801-814a-1715e2ef420a", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:04:36 +0000 (0:00:00.346) 0:06:26.505 ********* ok: [/cache/rhel-7.qcow2] META: role_complete for /cache/rhel-7.qcow2 TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/tests_luks.yml:549 Thursday 21 July 2022 18:04:36 +0000 (0:00:00.850) 0:06:27.355 ********* included: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml for /cache/rhel-7.qcow2 TASK [Print out pool information] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:04:37 +0000 (0:00:00.050) 0:06:27.406 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:04:37 +0000 (0:00:00.041) 0:06:27.448 ********* ok: [/cache/rhel-7.qcow2] => { "_storage_volumes_list": [ { "_device": "/dev/nvme1n1", "_mount_id": "UUID=xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT", "_raw_device": "/dev/nvme1n1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "nvme1n1" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:04:37 +0000 (0:00:00.055) 0:06:27.503 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "info": { "/dev/fd0": { "fstype": "", "label": "", "name": "/dev/fd0", "size": "4K", "type": "disk", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-17-57-57-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "xfs", "label": "", "name": "/dev/vda1", "size": "10G", "type": "partition", "uuid": "21864ae1-1c29-4009-a1c2-151e41d0e053" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:04:37 +0000 (0:00:00.332) 0:06:27.836 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003459", "end": "2022-07-21 14:04:37.684839", "rc": 0, "start": "2022-07-21 14:04:37.681380" } STDOUT: # # /etc/fstab # Created by anaconda on Tue Jul 19 03:15:15 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=21864ae1-1c29-4009-a1c2-151e41d0e053 / xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:04:37 +0000 (0:00:00.317) 0:06:28.154 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003473", "end": "2022-07-21 14:04:37.994574", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:04:37.991101" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.309) 0:06:28.464 ********* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.022) 0:06:28.486 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml for /cache/rhel-7.qcow2 => (item={'_raw_device': '/dev/nvme1n1', 'raid_metadata_version': None, 'mount_device_identifier': 'uuid', 'fs_type': 'lvmpv', 
'mount_options': 'defaults', '_device': '/dev/nvme1n1', 'size': 10737418240, 'mount_point': None, 'compression': None, 'encryption_password': None, 'encryption': False, 'raid_level': None, 'raid_device_count': None, 'state': 'absent', 'vdo_pool_size': None, 'thin_pool_name': None, 'type': 'disk', 'encryption_key_size': None, 'encryption_cipher': None, 'encryption_key': None, 'fs_label': '', 'encryption_luks_version': None, 'cache_size': 0, '_mount_id': 'UUID=xAhHrq-vxRd-zzpO-c8hA-fKZ2-JwiL-Cq4NyT', 'raid_spare_count': None, 'name': 'foo', 'cache_mode': None, 'cache_devices': [], 'deduplication': None, 'cached': False, 'fs_overwrite_existing': True, 'disks': ['nvme1n1'], 'thin': None, 'mount_check': 0, 'mount_passno': 0, 'raid_chunk_size': None, 'thin_pool_size': None, 'fs_create_options': ''}) TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.059) 0:06:28.546 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.052) 0:06:28.599 ********* included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml for /cache/rhel-7.qcow2 => (item=mount) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-7.qcow2 => (item=fstab) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml for /cache/rhel-7.qcow2 => (item=fs) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml for /cache/rhel-7.qcow2 => (item=device) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-7.qcow2 => (item=encryption) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml for /cache/rhel-7.qcow2 => (item=md) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml for /cache/rhel-7.qcow2 => (item=size) included: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml for /cache/rhel-7.qcow2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.079) 0:06:28.678 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_device_path": "/dev/nvme1n1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.046) 0:06:28.724 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.056) 0:06:28.781 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] 
*************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.026) 0:06:28.807 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.049) 0:06:28.857 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.037) 0:06:28.894 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.025) 0:06:28.919 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.025) 0:06:28.945 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.023) 0:06:28.968 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.033) 0:06:29.001 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.110) 0:06:29.112 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.023) 0:06:29.136 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.096) 
0:06:29.232 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.038) 0:06:29.270 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.085) 0:06:29.355 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:04:38 +0000 (0:00:00.023) 0:06:29.379 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:04:39 +0000 (0:00:00.023) 0:06:29.402 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "stat": { "atime": 1658426673.5517247, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658426673.5517247, "dev": 5, "device_type": 66305, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 10779, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1658426673.5517247, "nlink": 1, "path": "/dev/nvme1n1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:04:39 +0000 (0:00:00.404) 0:06:29.807 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:04:39 +0000 (0:00:00.037) 0:06:29.844 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:04:39 +0000 (0:00:00.022) 0:06:29.867 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 
2022 18:04:39 +0000 (0:00:00.037) 0:06:29.904 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:04:39 +0000 (0:00:00.026) 0:06:29.931 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:04:39 +0000 (0:00:00.024) 0:06:29.956 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:04:39 +0000 (0:00:00.023) 0:06:29.979 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.535) 0:06:30.514 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.025) 0:06:30.540 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.025) 0:06:30.565 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.023) 0:06:30.588 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.022) 0:06:30.611 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.021) 0:06:30.633 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.023) 0:06:30.656 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.024) 0:06:30.680 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.022) 0:06:30.703 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.053) 0:06:30.756 ********* ok: [/cache/rhel-7.qcow2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.055) 0:06:30.812 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.036) 0:06:30.848 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.034) 0:06:30.882 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.033) 0:06:30.916 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.036) 0:06:30.952 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.033) 0:06:30.986 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.034) 0:06:31.021 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.034) 0:06:31.055 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.036) 0:06:31.092 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.034) 0:06:31.127 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.035) 0:06:31.162 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.039) 0:06:31.201 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.025) 0:06:31.227 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.035) 0:06:31.262 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.039) 0:06:31.301 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.037) 0:06:31.338 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:04:40 +0000 (0:00:00.036) 0:06:31.374 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.034) 0:06:31.409 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: 
/tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.034) 0:06:31.443 ********* skipping: [/cache/rhel-7.qcow2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.035) 0:06:31.479 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.036) 0:06:31.516 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.034) 0:06:31.550 ********* ok: [/cache/rhel-7.qcow2] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.077) 0:06:31.627 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.022) 0:06:31.650 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.023) 0:06:31.674 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.021) 0:06:31.696 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.021) 0:06:31.717 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.021) 0:06:31.739 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.060) 0:06:31.799 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] 
******************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.023) 0:06:31.822 ********* skipping: [/cache/rhel-7.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmptomayb7j/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.022) 0:06:31.845 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.033) 0:06:31.878 ********* ok: [/cache/rhel-7.qcow2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/rhel-7.qcow2 : ok=1178 changed=63 unreachable=0 failed=9 skipped=670 rescued=9 ignored=0 Thursday 21 July 2022 18:04:41 +0000 (0:00:00.045) 0:06:31.924 ********* =============================================================================== fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state -- 61.96s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state -- 31.88s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : make sure blivet is available ------- 8.94s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 8.13s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 7.83s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 7.73s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 7.63s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 7.39s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 7.35s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : Update facts ------------------------ 1.85s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.50s 
/tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : make sure required packages are installed --- 1.48s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 set up internal repositories -------------------------------------------- 1.37s /cache/rhel-7_setup.yml:5 ----------------------------------------------------- fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.34s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Collect info about the volumes. ----------------------------------------- 1.32s /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 --------------------- Collect info about the volumes. ----------------------------------------- 1.32s /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 --------------------- Collect info about the volumes. ----------------------------------------- 1.32s /tmp/tmptomayb7j/tests/storage/verify-role-results.yml:14 --------------------- fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.21s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.19s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.19s /tmp/tmp5bkr4li_/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
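The log above records the final teardown pass of the LUKS test: the storage role destroys the LUKS mapping, the foo-test1 logical volume, the foo volume group, and the lvmpv format on nvme1n1, then reports the removed crypts and mounts. A minimal sketch of the kind of invocation that produces such a teardown, assuming the fedora.linux_system_roles collection is installed and that nvme1n1 is a disposable test disk; the volume parameters mirror the _storage_volumes_list values logged above, and this is not the literal test playbook:

# Sketch only: ask the storage role to remove the test volume and everything on it.
- hosts: all
  become: true
  tasks:
    - name: Remove the test volume and the layers built on top of it
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo          # matches the volume name reported above
            type: disk
            disks:
              - nvme1n1
            state: absent      # triggers the destroy actions seen in the log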
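The "remove obsolete mounts" and "tell systemd to refresh its view of /etc/fstab" tasks above use ansible.posix.mount (via the ansible.builtin.mount redirect) and a systemd daemon reload. Outside the role, the same cleanup can be sketched as follows; the path, source, and fstype values are the ones reported in this run:

# Sketch of the mount cleanup pattern logged above.
- name: Remove the retired mount from /etc/fstab and unmount it
  ansible.posix.mount:
    path: /opt/test1
    src: /dev/mapper/luks-cad32e3f-2d91-4801-814a-1715e2ef420a
    fstype: xfs
    state: absent

- name: Tell systemd to refresh its view of /etc/fstab
  ansible.builtin.systemd:
    daemon_reload: true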
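The "manage /etc/crypttab to account for changes we just made" task reports "1 line(s) removed" for the retired LUKS mapping. The role drives this with its own task logic; an equivalent standalone edit can be sketched with community.general.crypttab, using the device name from this run:

# Sketch: remove the crypttab entry for the LUKS device torn down above.
- name: Drop the crypttab entry for the removed LUKS device
  community.general.crypttab:
    name: luks-cad32e3f-2d91-4801-814a-1715e2ef420a
    state: absent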
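verify-role-results.yml then reads /etc/fstab and /etc/crypttab back with plain cat commands and asserts on their contents. A condensed sketch of that pattern, reusing the register names that the log later clears; the real assertions in the test suite are more detailed:

# Sketch of the verification reads and a simplified assertion.
- name: Read the /etc/fstab file for volume existence
  ansible.builtin.command: cat /etc/fstab
  register: storage_test_fstab
  changed_when: false

- name: Read the /etc/crypttab file
  ansible.builtin.command: cat /etc/crypttab
  register: storage_test_crypttab
  changed_when: false
  failed_when: false   # the file may legitimately be empty

- name: Assert that the retired mount and LUKS entry are gone
  ansible.builtin.assert:
    that:
      - storage_test_fstab.stdout_lines | select('search', '/opt/test1') | list | length == 0
      - storage_test_crypttab.stdout_lines | select('search', 'luks-cad32e3f') | list | length == 0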
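The run of "included: .../test-verify-volume-*.yml" lines above comes from a set_fact that lists the per-volume checks followed by an include_tasks loop. A sketch of that dispatch pattern; the check names and file names are the ones in the log, while the loop variable name is illustrative:

# Sketch of the per-volume test dispatch.
- name: Define the per-volume checks to run
  ansible.builtin.set_fact:
    _storage_volume_tests:
      - mount
      - fstab
      - fs
      - device
      - encryption
      - md
      - size
      - cache

- name: Run each per-volume check
  ansible.builtin.include_tasks: "test-verify-volume-{{ volume_test }}.yml"
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: volume_test   # illustrative name, not necessarily the suite's loop_var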
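Finally, the device and encryption checks above reduce to a stat of the device node plus a cryptsetup presence check, with the LUKS-specific queries skipped because this volume is not encrypted. A rough sketch under those assumptions; the when guard and the storage_test_volume variable name are illustrative, not taken from this log:

# Sketch of the device-node and encryption checks.
- name: See whether the device node is present
  ansible.builtin.stat:
    path: /dev/nvme1n1
    follow: true
  register: storage_test_dev

- name: Verify the presence of the device node
  ansible.builtin.assert:
    that:
      - storage_test_dev.stat.exists
      - storage_test_dev.stat.isblk   # raw disks keep their node even when the volume is absent

- name: Ensure cryptsetup is present
  ansible.builtin.package:
    name: cryptsetup
    state: present

- name: Collect LUKS info for this volume (only when it is encrypted)
  ansible.builtin.command: "cryptsetup luksDump {{ storage_test_volume._raw_device }}"
  register: storage_test_luks_dump
  changed_when: false
  when: storage_test_volume.encryption | bool   # skipped in the run above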