ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-CuQ
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml

PLAY [Basic usability test] ****************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:3
Saturday 25 October 2025  18:05:54 -0400 (0:00:00.017)       0:00:00.017 ******
[WARNING]: Platform linux on host managed-node1 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node1]

TASK [Run the role] ************************************************************
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:15
Saturday 25 October 2025  18:05:55 -0400 (0:00:01.010)       0:00:01.027 ******
included: fedora.linux_system_roles.gfs2 for managed-node1

TASK [fedora.linux_system_roles.gfs2 : Validating arguments against arg spec 'main' - The gfs2 role.] ***
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:3
Saturday 25 October 2025  18:05:55 -0400 (0:00:00.037)       0:00:01.065 ******
ok: [managed-node1] => {
    "changed": false,
    "validate_args_context": {
        "argument_spec_name": "main",
        "name": "gfs2",
        "path": "/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2",
        "type": "role"
    }
}

MSG:

The arg spec validation passed

TASK [fedora.linux_system_roles.gfs2 : Set platform/version specific variables] ***
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/main.yml:3
Saturday 25 October 2025  18:05:55 -0400 (0:00:00.016)       0:00:01.082 ******
included: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.gfs2 : Ensure ansible_facts used by role] ******
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:2
Saturday 25 October 2025  18:05:55 -0400 (0:00:00.018)       0:00:01.100 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__gfs2_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.gfs2 : Check if system is ostree] **************
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:10
Saturday 25 October 2025  18:05:55 -0400 (0:00:00.031)       0:00:01.131 ******
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.gfs2 : Set flag to indicate system is ostree] ***
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:15
Saturday 25 October 2025  18:05:56 -0400 (0:00:00.409)       0:00:01.541 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "__gfs2_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.gfs2 : Set platform/version specific variables] ***
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:19
Saturday 25 October 2025  18:05:56 -0400 (0:00:00.020)       0:00:01.562 ******
skipping: [managed-node1] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__gfs2_repos": [
            {
                "id": "resilientstorage",
                "name": "ResilientStorage"
            }
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__gfs2_repos": [
            {
                "id": "resilientstorage",
                "name": "ResilientStorage"
            }
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
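The two ostree tasks above amount to a stat of /run/ostree-booted followed by a set_fact on the result. A minimal sketch of that pattern, reconstructed from the task results in this output rather than taken from the role's task files (the register name __ostree_check is illustrative):

    # Sketch only: reconstructed from the results above, not the role's source.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_check   # illustrative register name

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __gfs2_is_ostree: "{{ __ostree_check.stat.exists }}"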
TASK [fedora.linux_system_roles.gfs2 : Check if role is supported on current architecture] ***
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/main.yml:6
Saturday 25 October 2025  18:05:56 -0400 (0:00:00.040)       0:00:01.602 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_facts[\"architecture\"] in [\"aarch64\"]",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.gfs2 : Install required packages] **************
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/main.yml:13
Saturday 25 October 2025  18:05:56 -0400 (0:00:00.015)       0:00:01.618 ******
included: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml for managed-node1

TASK [fedora.linux_system_roles.gfs2 : Find environment-specific tasks to enable repositories] ***
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml:3
Saturday 25 October 2025  18:05:56 -0400 (0:00:00.017)       0:00:01.635 ******
ok: [managed-node1] => (item=RedHat.yml) => {
    "ansible_facts": {
        "__gfs2_enable_repo_tasks_file": "/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/enable-repositories/RedHat.yml"
    },
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml"
}
ok: [managed-node1] => (item=CentOS.yml) => {
    "ansible_facts": {
        "__gfs2_enable_repo_tasks_file": "/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/enable-repositories/CentOS.yml"
    },
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS.yml"
}
skipping: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__gfs2_enable_repo_tasks_file_candidate is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__gfs2_enable_repo_tasks_file_candidate is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.gfs2 : Run environment-specific tasks to enable repositories] ***
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml:23
Saturday 25 October 2025  18:05:56 -0400 (0:00:00.059)       0:00:01.695 ******
included: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/enable-repositories/CentOS.yml for managed-node1

TASK [fedora.linux_system_roles.gfs2 : List active CentOS repositories] ********
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/enable-repositories/CentOS.yml:3
Saturday 25 October 2025  18:05:56 -0400 (0:00:00.041)       0:00:01.737 ******
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "dnf",
        "repolist"
    ],
    "delta": "0:00:00.191064",
    "end": "2025-10-25 18:05:56.961264",
    "rc": 0,
    "start": "2025-10-25 18:05:56.770200"
}

STDOUT:

repo id                                   repo name
appstream                                 CentOS Stream 9 - AppStream
baseos                                    CentOS Stream 9 - BaseOS
beaker-client                             Beaker Client - RedHatEnterpriseLinux9
beaker-harness                            Beaker harness
beakerlib-libraries                       Copr repo for beakerlib-libraries owned by bgoncalv
copr:copr.devel.redhat.com:lpol:qa-tools  Copr repo for qa-tools owned by lpol
extras-common                             CentOS Stream 9 - Extras packages
highavailability                          CentOS Stream 9 - HighAvailability
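The repository-enablement step that follows runs dnf config-manager for each entry in __gfs2_repos (here only resilientstorage, loaded from CentOS_9.yml above). A sketch of an equivalent task, based on the command recorded in this run rather than on the role's actual task file:

    # Sketch only: mirrors the command shown in this run; not the role's source.
    - name: Enable CentOS repositories
      ansible.builtin.command:
        argv:
          - dnf
          - config-manager
          - --set-enabled
          - "{{ item.id }}"
      loop: "{{ __gfs2_repos }}"
      changed_when: true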
TASK [fedora.linux_system_roles.gfs2 : Enable CentOS repositories] *************
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/enable-repositories/CentOS.yml:9
Saturday 25 October 2025  18:05:57 -0400 (0:00:00.607)       0:00:02.344 ******
changed: [managed-node1] => (item={'id': 'resilientstorage', 'name': 'ResilientStorage'}) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "dnf",
        "config-manager",
        "--set-enabled",
        "resilientstorage"
    ],
    "delta": "0:00:00.186940",
    "end": "2025-10-25 18:05:57.496640",
    "item": {
        "id": "resilientstorage",
        "name": "ResilientStorage"
    },
    "rc": 0,
    "start": "2025-10-25 18:05:57.309700"
}

TASK [fedora.linux_system_roles.gfs2 : Install packages] ***********************
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml:29
Saturday 25 October 2025  18:05:57 -0400 (0:00:00.538)       0:00:02.883 ******
fatal: [managed-node1]: FAILED! => {
    "changed": false,
    "rc": 1,
    "results": []
}

MSG:

Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried

TASK [Check role error on unsupported arch] ************************************
task path: /tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:19
Saturday 25 October 2025  18:06:02 -0400 (0:00:05.071)       0:00:07.955 ******
fatal: [managed-node1]: FAILED! => {
    "assertion": "__msg in ansible_failed_result.msg",
    "changed": false,
    "evaluated_to": false
}

MSG:

Assertion failed

PLAY RECAP *********************************************************************
managed-node1              : ok=12   changed=1    unreachable=0    failed=1    skipped=2    rescued=1    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "ansible_version": "2.17.14",
        "end_time": "2025-10-25T22:06:02.618382+00:00Z",
        "host": "managed-node1",
        "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried",
        "rc": 1,
        "start_time": "2025-10-25T22:05:57.551307+00:00Z",
        "task_name": "Install packages",
        "task_path": "/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml:29"
    },
    {
        "ansible_version": "2.17.14",
        "end_time": "2025-10-25T22:06:02.636243+00:00Z",
        "host": "managed-node1",
        "message": "Assertion failed",
        "start_time": "2025-10-25T22:06:02.623169+00:00Z",
        "task_name": "Check role error on unsupported arch",
        "task_path": "/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:19"
    }
]
SYSTEM ROLES ERRORS END v1
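The fatal result above comes from dnf being unable to fetch repodata for the highavailability repository ("All mirrors were tried"), which points at a mirror or metadata problem on the test host rather than at the package list itself; the follow-up assertion then fails because the error message is not the unsupported-architecture message the test appears to expect. For reference, an equivalent of the failing install step, reconstructed from the dnf module invocation captured in the journal below (a sketch, not the role's task file):

    # Sketch only: package list taken from the logged dnf module invocation.
    - name: Install packages
      ansible.builtin.dnf:
        name:
          - dlm
          - lvm2-lockd
          - gfs2-utils
        state: present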
TASKS RECAP ********************************************************************
Saturday 25 October 2025  18:06:02 -0400 (0:00:00.014)       0:00:07.969 ******
===============================================================================
fedora.linux_system_roles.gfs2 : Install packages ----------------------- 5.07s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml:29
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:3
fedora.linux_system_roles.gfs2 : List active CentOS repositories -------- 0.61s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/enable-repositories/CentOS.yml:3
fedora.linux_system_roles.gfs2 : Enable CentOS repositories ------------- 0.54s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/enable-repositories/CentOS.yml:9
fedora.linux_system_roles.gfs2 : Check if system is ostree -------------- 0.41s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:10
fedora.linux_system_roles.gfs2 : Find environment-specific tasks to enable repositories --- 0.06s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml:3
fedora.linux_system_roles.gfs2 : Run environment-specific tasks to enable repositories --- 0.04s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/install-packages.yml:23
fedora.linux_system_roles.gfs2 : Set platform/version specific variables --- 0.04s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:19
Run the role ------------------------------------------------------------ 0.04s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:15
fedora.linux_system_roles.gfs2 : Ensure ansible_facts used by role ------ 0.03s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:2
fedora.linux_system_roles.gfs2 : Set flag to indicate system is ostree --- 0.02s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/set_vars.yml:15
fedora.linux_system_roles.gfs2 : Set platform/version specific variables --- 0.02s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/main.yml:3
fedora.linux_system_roles.gfs2 : Install required packages -------------- 0.02s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/main.yml:13
fedora.linux_system_roles.gfs2 : Validating arguments against arg spec 'main' - The gfs2 role. --- 0.02s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:3
fedora.linux_system_roles.gfs2 : Check if role is supported on current architecture --- 0.02s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/roles/gfs2/tasks/main.yml:6
Check role error on unsupported arch ------------------------------------ 0.01s
/tmp/collections-CuQ/ansible_collections/fedora/linux_system_roles/tests/gfs2/tests_default.yml:19

Oct 25 18:05:54 managed-node1 sshd-session[7823]: Accepted publickey for root from 10.31.44.217 port 40308 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 25 18:05:54 managed-node1 systemd-logind[593]: New session 13 of user root.
░░ Subject: A new session 13 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 13 has been created for the user root.
░░
░░ The leading process of the session is 7823.
Oct 25 18:05:54 managed-node1 systemd[1]: Started Session 13 of User root.
░░ Subject: A start job for unit session-13.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-13.scope has finished successfully.
░░
░░ The job identifier is 1453.
Oct 25 18:05:54 managed-node1 sshd-session[7823]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 25 18:05:54 managed-node1 sshd-session[7826]: Received disconnect from 10.31.44.217 port 40308:11: disconnected by user
Oct 25 18:05:54 managed-node1 sshd-session[7826]: Disconnected from user root 10.31.44.217 port 40308
Oct 25 18:05:54 managed-node1 sshd-session[7823]: pam_unix(sshd:session): session closed for user root
Oct 25 18:05:54 managed-node1 systemd[1]: session-13.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-13.scope has successfully entered the 'dead' state.
Oct 25 18:05:54 managed-node1 systemd-logind[593]: Session 13 logged out. Waiting for processes to exit.
Oct 25 18:05:54 managed-node1 systemd-logind[593]: Removed session 13.
░░ Subject: Session 13 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 13 has been terminated.
Oct 25 18:05:55 managed-node1 python3.9[8024]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 25 18:05:56 managed-node1 python3.9[8199]: ansible-ansible.builtin.stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 25 18:05:56 managed-node1 python3.9[8348]: ansible-ansible.legacy.command Invoked with _raw_params=dnf repolist _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 25 18:05:57 managed-node1 python3.9[8498]: ansible-ansible.legacy.command Invoked with _raw_params=dnf config-manager --set-enabled resilientstorage _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 25 18:05:58 managed-node1 python3.9[8648]: ansible-ansible.legacy.dnf Invoked with name=['dlm', 'lvm2-lockd', 'gfs2-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 25 18:06:02 managed-node1 sshd-session[8708]: Accepted publickey for root from 10.31.44.217 port 47814 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 25 18:06:02 managed-node1 systemd-logind[593]: New session 14 of user root.
░░ Subject: A new session 14 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 14 has been created for the user root.
░░
░░ The leading process of the session is 8708.
Oct 25 18:06:02 managed-node1 systemd[1]: Started Session 14 of User root.
░░ Subject: A start job for unit session-14.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-14.scope has finished successfully.
░░
░░ The job identifier is 1522.
Oct 25 18:06:02 managed-node1 sshd-session[8708]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 25 18:06:02 managed-node1 sshd-session[8711]: Received disconnect from 10.31.44.217 port 47814:11: disconnected by user
Oct 25 18:06:02 managed-node1 sshd-session[8711]: Disconnected from user root 10.31.44.217 port 47814
Oct 25 18:06:02 managed-node1 sshd-session[8708]: pam_unix(sshd:session): session closed for user root
Oct 25 18:06:02 managed-node1 systemd[1]: session-14.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-14.scope has successfully entered the 'dead' state.
Oct 25 18:06:02 managed-node1 systemd-logind[593]: Session 14 logged out. Waiting for processes to exit.
Oct 25 18:06:02 managed-node1 systemd-logind[593]: Removed session 14.
░░ Subject: Session 14 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 14 has been terminated.
Oct 25 18:06:02 managed-node1 sshd-session[8736]: Accepted publickey for root from 10.31.44.217 port 47828 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 25 18:06:02 managed-node1 systemd-logind[593]: New session 15 of user root.
░░ Subject: A new session 15 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 15 has been created for the user root.
░░
░░ The leading process of the session is 8736.
Oct 25 18:06:02 managed-node1 systemd[1]: Started Session 15 of User root.
░░ Subject: A start job for unit session-15.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-15.scope has finished successfully.
░░
░░ The job identifier is 1591.
Oct 25 18:06:02 managed-node1 sshd-session[8736]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)