ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml
statically imported: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml
statically imported: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/check_candlepin.yml
statically imported: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml
statically imported: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_proxy.yml ******************************************************
1 plays in /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml

PLAY [Basic proxy test] ********************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml:3
Saturday 21 March 2026 15:34:56 -0400 (0:00:00.023) 0:00:00.024 ********
ok: [managed-node1]
META: ran handlers

TASK [Get LSR_RHC_TEST_DATA environment variable] ******************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:3
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.859) 0:00:00.883 ********
ok: [managed-node1] => { "ansible_facts": { "lsr_rhc_test_data_file": "" }, "changed": false }

TASK [Import test data] ********************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:12
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.036) 0:00:00.920 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get facts for external test data] ****************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:16
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.034) 0:00:00.954 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set local lsr_rhc_test_data] *********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:24
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.033) 0:00:00.987 ********
ok: [managed-node1] => { "ansible_facts": { "lsr_rhc_test_data": { "baseurl": "http://localhost:8080", "candlepin_host": "candlepin.local", "candlepin_insecure": false, "candlepin_port": 8443, "candlepin_prefix": "/candlepin", "env_nonworking": "Ceci n'est pas une environment", "envs_register": [ "Environment 2" ], "insights": false, "proxy_auth_hostname": "localhost", "proxy_auth_password": "proxypass", "proxy_auth_port": 3130, "proxy_auth_scheme": "https", "proxy_auth_username": "proxyuser", "proxy_noauth_hostname": "localhost", "proxy_noauth_port": 3128, "proxy_noauth_scheme": "https", "proxy_nonworking_hostname": "wrongproxy", "proxy_nonworking_password": "wrong-proxypassword", "proxy_nonworking_port": 4000, "proxy_nonworking_username": "wrong-proxyuser", "reg_activation_keys": [ "default_key" ], "reg_invalid_password": "invalid-password", "reg_invalid_username": "invalid-user", "reg_organization": "donaldduck", "reg_password": "password", "reg_username": "doc", "release": null, "repositories": [ { "name": "donaldy-content-label-7051", "state": "enabled" }, { "name": "content-label-32060", "state": "disabled" } ] } }, "ansible_included_var_files": [ "/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/../files/candlepin_data.yml" ], "changed": false }

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:32
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.037) 0:00:01.025 ********
ok: [managed-node1] => { "changed": false, "stat": { "exists": false } }

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:37
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.452) 0:00:01.477 ********
ok: [managed-node1] => { "ansible_facts": { "__rhc_is_ostree": false }, "changed": false }

TASK [Set flag to indicate use of external proxy] ******************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:41
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.038) 0:00:01.515 ********
ok: [managed-node1] => { "ansible_facts": { "__rhc_external_proxy_url": "", "__rhc_use_external_proxy": false }, "changed": false }

TASK [Set rhc_external_proxy to empty when external proxy is not used] *********
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:46
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.038) 0:00:01.553 ********
ok: [managed-node1] => { "ansible_facts": { "rhc_external_proxy": {} }, "changed": false }

TASK [Parse and set rhc_external_proxy] ****************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:54
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.036) 0:00:01.590 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Add proxy vars to bashrc] ************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_test_data.yml:63
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.010) 0:00:01.600 ********
skipping: [managed-node1] => (item=export http_proxy=) => { "ansible_loop_var": "item", "changed": false, "item": "export http_proxy=", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=export https_proxy=) => { "ansible_loop_var": "item", "changed": false, "item": "export https_proxy=", "skip_reason": "Conditional result was False" }

TASK [Get facts for external test data] ****************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:9
Saturday 21 March 2026 15:34:57 -0400 (0:00:00.035) 0:00:01.635 ********
ok: [managed-node1]

TASK [Set helper fact for Candlepin base URL] **********************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:17
Saturday 21 March 2026 15:34:58 -0400 (0:00:00.458) 0:00:02.094 ********
ok: [managed-node1] => { "ansible_facts": { "_cp_url": "https://candlepin.local:8443/candlepin" }, "changed": false }

TASK [Set helper fact for Candlepin owner URL] *********************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:21
Saturday 21 March 2026 15:34:58 -0400 (0:00:00.037) 0:00:02.132 ********
ok: [managed-node1] => { "ansible_facts": { "_cp_url_owner": "https://candlepin.local:8443/candlepin/owners/donaldduck" }, "changed": false }

TASK [Add candlepin hostname to /etc/hosts] ************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:25
Saturday 21 March 2026 15:34:58 -0400 (0:00:00.037) 0:00:02.169 ********
changed: [managed-node1] => { "backup": "", "changed": true }
MSG: line added

TASK [Install needed packages] *************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:31
Saturday 21 March 2026 15:34:58 -0400 (0:00:00.435) 0:00:02.604 ********
changed: [managed-node1] => { "changed": true, "rc": 0, "results": [ "Installed: podman-gvproxy-3:4.9.4-0.1.module_el8+971+3d3df00d.x86_64", "Installed: podman-3:4.9.4-0.1.module_el8+971+3d3df00d.x86_64", "Installed: libnet-1.1.6-15.el8.x86_64", "Installed: podman-catatonit-3:4.9.4-0.1.module_el8+971+3d3df00d.x86_64", "Installed: podman-plugins-3:4.9.4-0.1.module_el8+971+3d3df00d.x86_64", "Installed: containernetworking-plugins-1:1.4.0-2.module_el8+974+0c52b299.x86_64", "Installed: runc-1:1.1.12-1.module_el8+885+7da147f3.x86_64", "Installed: conmon-3:2.1.10-1.module_el8+804+f131391c.x86_64", "Installed: fuse-common-3.3.0-19.el8.x86_64", "Installed: shadow-utils-subid-2:4.6-22.el8.x86_64", "Installed: criu-3.18-4.module_el8+804+f131391c.x86_64", "Installed: container-selinux-2:2.229.0-2.module_el8+847+7863d4e6.noarch", "Installed: dnsmasq-2.79-33.el8.x86_64", "Installed: libslirp-4.4.0-1.module_el8+804+f131391c.x86_64", "Installed: protobuf-c-1.3.0-8.el8.x86_64", "Installed: slirp4netns-1.2.3-1.module_el8+951+32019cde.x86_64", "Installed: fuse3-libs-3.3.0-19.el8.x86_64", "Installed: fuse3-3.3.0-19.el8.x86_64", "Installed: containers-common-2:1-81.module_el8+968+fbb249c7.x86_64", "Installed: fuse-overlayfs-1.13-1.module_el8+804+f131391c.x86_64" ] }
lsrpackages: podman

TASK [Clean up Candlepin container] ********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:38
Saturday 21 March 2026 15:35:48 -0400 (0:00:49.468) 0:00:52.073 ********
included: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml for managed-node1

TASK [Check if the candlepin container exists] *********************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml:6
Saturday 21 March 2026 15:35:48 -0400 (0:00:00.043) 0:00:52.117 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "podman", "ps", "-a", "--filter", "name=candlepin" ], "delta": "0:00:01.172059", "end": "2026-03-21 15:35:50.008505", "rc": 0, "start": "2026-03-21 15:35:48.836446" }
STDOUT:
CONTAINER ID  IMAGE  COMMAND  CREATED  STATUS  PORTS  NAMES

TASK [Ensure that Candlepin container doesn't exist] ***************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml:17
Saturday 21 March 2026 15:35:50 -0400 (0:00:01.640) 0:00:53.757 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Start Candlepin container] ***********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:41
Saturday 21 March 2026 15:35:50 -0400 (0:00:00.036) 0:00:53.793 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "podman", "run", "--rm", "--detach", "--hostname", "candlepin.local", "--name", "candlepin", "--publish", "8443:8443", "--publish", "8080:8080", "ghcr.io/candlepin/candlepin-unofficial" ], "delta": "0:00:15.421175", "end": "2026-03-21 15:36:05.832482", "rc": 0, "start": "2026-03-21 15:35:50.411307" }
STDOUT:
1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833
STDERR:
Trying to pull ghcr.io/candlepin/candlepin-unofficial:latest...
Getting image source signatures
Copying blob sha256:5baae3f93712d079b6030b8c02b29acecd6a7a6cdce52ab304b31425a048be6b
Copying config sha256:6c8d0128d946443dc2cb0b755129351b01ff7b7c65670349e7d53b40a05309c5
Writing manifest to image destination

TASK [Ensure directories exist] ************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:62
Saturday 21 March 2026 15:36:05 -0400 (0:00:15.852) 0:01:09.646 ********
ok: [managed-node1] => (item=/etc/pki/product) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": "/etc/pki/product", "mode": "0755", "owner": "root", "path": "/etc/pki/product", "secontext": "unconfined_u:object_r:cert_t:s0", "size": 6, "state": "directory", "uid": 0 }
ok: [managed-node1] => (item=/etc/pki/product-default) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": "/etc/pki/product-default", "mode": "0755", "owner": "root", "path": "/etc/pki/product-default", "secontext": "unconfined_u:object_r:cert_t:s0", "size": 6, "state": "directory", "uid": 0 }
ok: [managed-node1] => (item=/etc/rhsm/ca) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": "/etc/rhsm/ca", "mode": "0755", "owner": "root", "path": "/etc/rhsm/ca", "secontext": "system_u:object_r:rhsmcertd_config_t:s0", "size": 68, "state": "directory", "uid": 0 }

TASK [Copy product certificates] ***********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:72
Saturday 21 March 2026 15:36:10 -0400 (0:00:04.329) 0:01:13.975 ********
ok: [managed-node1] => (item=7050) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "podman", "cp", "candlepin:/home/candlepin/devel/candlepin/generated_certs/7050.pem", "/etc/pki/product-default/" ], "delta": "0:00:00.683301", "end": "2026-03-21 15:36:12.326676", "item": "7050", "rc": 0, "start": "2026-03-21 15:36:11.643375" }

TASK [Copy Candlepin CA certificate for subscription-manager] ******************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:83
Saturday 21 March 2026 15:36:12 -0400 (0:00:02.339) 0:01:16.315 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "podman", "cp", "candlepin:/etc/candlepin/certs/candlepin-ca.crt", "/etc/rhsm/ca/candlepin-ca.pem" ], "delta": "0:00:00.408837", "end": "2026-03-21 15:36:14.535654", "rc": 0, "start": "2026-03-21 15:36:14.126817" }

TASK [Copy Candlepin CA certificate for system] ********************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:92
Saturday 21 March 2026 15:36:14 -0400 (0:00:02.010) 0:01:18.325 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "podman", "cp", "candlepin:/etc/candlepin/certs/candlepin-ca.crt", "/etc/pki/ca-trust/source/anchors/candlepin-ca.pem" ], "delta": "0:00:00.431811", "end": "2026-03-21 15:36:16.221960", "rc": 0, "start": "2026-03-21 15:36:15.790149" }

TASK [Update system certificates store] ****************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:101
Saturday 21 March 2026 15:36:16 -0400 (0:00:01.754) 0:01:20.080 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "update-ca-trust", "extract" ], "delta": "0:00:01.817364", "end": "2026-03-21 15:36:19.031905", "rc": 0, "start": "2026-03-21 15:36:17.214541" }

TASK [Wait for started Candlepin] **********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:108
Saturday 21 March 2026 15:36:19 -0400 (0:00:02.803) 0:01:22.883 ********
ok: [managed-node1] => { "attempts": 1, "changed": false, "connection": "close", "content_type": "application/json", "cookies": {}, "cookies_string": "", "date": "Sat, 21 Mar 2026 19:36:32 GMT", "elapsed": 12, "redirected": true, "status": 200, "transfer_encoding": "chunked", "url": "https://candlepin.local:8443/candlepin/", "vary": "accept-encoding", "x_candlepin_request_uuid": "b1e5a1c6-8db5-4871-ba1e-625913e51b43", "x_version": "4.7.3-1" }
MSG: OK (unknown bytes)

TASK [Install GPG key for RPM repositories] ************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:118
Saturday 21 March 2026 15:36:33 -0400 (0:00:13.906) 0:01:36.790 ********
changed: [managed-node1] => { "changed": true, "checksum_dest": null, "checksum_src": "e535dabdc941afb531fa9bb75b9a98d22bca8b81", "dest": "/etc/pki/rpm-gpg/RPM-GPG-KEY-candlepin", "elapsed": 0, "gid": 0, "group": "root", "md5sum": "eeaf1f5c1d5537f19a46506be9014ae6", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:cert_t:s0", "size": 1660, "src": "/root/.ansible/tmp/ansible-tmp-1774121793.1764724-10725-186113065592624/tmpw43x6p2d", "state": "file", "status_code": 200, "uid": 0, "url": "http://candlepin.local:8080/RPM-GPG-KEY-candlepin" }
MSG: OK (1660 bytes)

TASK [Add environments] ********************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:127
Saturday 21 March 2026 15:36:33 -0400 (0:00:00.555) 0:01:37.346 ********
skipping: [managed-node1] => (item={'name': 'Environment 1', 'desc': 'The environment 1', 'id': 'envId1'}) => { "ansible_loop_var": "item", "changed": false, "item": { "desc": "The environment 1", "id": "envId1", "name": "Environment 1" }, "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item={'name': 'Environment 2', 'desc': 'The environment 2', 'id': 'envId2'}) => { "ansible_loop_var": "item", "changed": false, "item": { "desc": "The environment 2", "id": "envId2", "name": "Environment 2" }, "skip_reason": "Conditional result was False" }

TASK [Check Candlepin works] ***************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/check_candlepin.yml:3
Saturday 21 March 2026 15:36:33 -0400 (0:00:00.045) 0:01:37.391 ********
ok: [managed-node1] => { "changed": false, "connection": "close", "content_type": "application/json", "cookies": {}, "cookies_string": "", "date": "Sat, 21 Mar 2026 19:36:33 GMT", "elapsed": 0, "redirected": true, "status": 200, "transfer_encoding": "chunked", "url": "https://candlepin.local:8443/candlepin/", "vary": "accept-encoding", "x_candlepin_request_uuid": "f69bb04b-2ee2-4a25-b4a0-f28dc03052f5", "x_version": "4.7.3-1" }
MSG: OK (unknown bytes)

TASK [Install packages for squid] **********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:7
Saturday 21 March 2026 15:36:34 -0400 (0:00:00.549) 0:01:37.941 ********
changed: [managed-node1] => { "changed": true, "rc": 0, "results": [ "Installed: apr-util-bdb-1.6.1-9.el8.x86_64", "Installed: perl-Math-Complex-1.59-422.el8.noarch", "Installed: squid-7:4.15-10.module_el8+997+5764cec8.x86_64", "Installed: httpd-tools-2.4.37-64.module_el8+965+1ad5c49d.x86_64", "Installed: perl-Digest-SHA-1:6.02-1.el8.x86_64", "Installed: apr-util-openssl-1.6.1-9.el8.x86_64", "Installed: libtool-ltdl-2.4.6-25.el8.x86_64", "Installed: libecap-1.0.1-2.module_el8+660+c5a9a808.x86_64", "Installed: apr-1.6.3-12.el8.x86_64", "Installed: perl-DBI-1.641-4.module_el8+332+132e4365.x86_64", "Installed: apr-util-1.6.1-9.el8.x86_64", "Installed: perl-Math-BigInt-1:1.9998.11-7.el8.noarch" ] }
lsrpackages: httpd-tools squid

TASK [Check the status of the backup of configuration] *************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:15
Saturday 21 March 2026 15:36:40 -0400 (0:00:06.215) 0:01:44.156 ********
ok: [managed-node1] => { "changed": false, "stat": { "exists": false } }

TASK [Backup the configuration] ************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:20
Saturday 21 March 2026 15:36:40 -0400 (0:00:00.482) 0:01:44.639 ********
changed: [managed-node1] => { "changed": true, "checksum": "03416f7b93f3c21eedb46d4e75c2ccd76be402e4", "dest": "/etc/squid/squid.conf.BACKUP", "gid": 0, "group": "root", "md5sum": "d5d9b333b227e203ea890877e8587e84", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:squid_conf_t:s0", "size": 2482, "src": "/etc/squid/squid.conf", "state": "file", "uid": 0 }

TASK [Copy the pristine configuration back] ************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:29
Saturday 21 March 2026 15:36:41 -0400 (0:00:00.480) 0:01:45.120 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Open the Candlepin port] *************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:38
Saturday 21 March 2026 15:36:41 -0400 (0:00:00.043) 0:01:45.164 ********
changed: [managed-node1] => { "backup": "", "changed": true }
MSG: line added

TASK [Set the shutdown lifetime] ***********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:47
Saturday 21 March 2026 15:36:41 -0400 (0:00:00.382) 0:01:45.546 ********
changed: [managed-node1] => { "backup": "", "changed": true }
MSG: line added

TASK [Set the port] ************************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:57
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.369) 0:01:45.916 ********
ok: [managed-node1] => { "backup": "", "changed": false }

TASK [Create the new passwd file] **********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:66
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.364) 0:01:46.281 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set the port] ************************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:78
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.037) 0:01:46.318 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Disable HTTP access allow] ***********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:84
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.035) 0:01:46.354 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Insert initial auth config] **********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:90
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.039) 0:01:46.394 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Add authenticated acl] ***************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:103
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.036) 0:01:46.431 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Allow authenticated acl] *************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:111
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.036) 0:01:46.467 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Restart squid] ***********************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:119
Saturday 21 March 2026 15:36:42 -0400 (0:00:00.036) 0:01:46.503 ********
changed: [managed-node1] => { "changed": true, "name": "squid", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "network.target basic.target systemd-journald.socket nss-lookup.target sysinit.target system.slice network-online.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot
cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Squid caching proxy", "DevicePolicy": "auto", "Documentation": "man:squid(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "EnvironmentFiles": "/etc/sysconfig/squid (ignore_errors=no)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/usr/bin/kill ; argv[]=/usr/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/squid ; argv[]=/usr/sbin/squid --foreground $SQUID_OPTS -f ${SQUID_CONF} ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartPre": "{ path=/usr/libexec/squid/cache_swap.sh ; argv[]=/usr/libexec/squid/cache_swap.sh ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/squid.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"squid.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "16384", "LimitNOFILESoft": "16384", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "squid.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "all", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/squid.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "notify", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Add SELinux policy for proxy ports] ************************************** task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml:25 Saturday 21 March 2026 15:36:43 
-0400 (0:00:00.802) 0:01:47.305 ********
ERROR! the role 'fedora.linux_system_roles.selinux' was not found in fedora.linux_system_roles:ansible.legacy:/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/roles:/root/.ansible/roles:/usr/share/ansible/roles:/etc/ansible/roles:/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc

The error appears to be in '/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml': line 27, column 19, but may be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

        include_role:
          name: fedora.linux_system_roles.selinux
                ^ here

TASK [Unregister] **************************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml:342
Saturday 21 March 2026 15:36:43 -0400 (0:00:00.034) 0:01:47.339 ********
included: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/run_role_with_clear_facts.yml for managed-node1
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/run_role_with_clear_facts.yml:22
Saturday 21 March 2026 15:36:43 -0400 (0:00:00.037) 0:01:47.376 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/run_role_with_clear_facts.yml:32
Saturday 21 March 2026 15:36:43 -0400 (0:00:00.038) 0:01:47.415 ********

TASK [fedora.linux_system_roles.rhc : Set ansible_facts required by role] ******
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/main.yml:3
Saturday 21 March 2026 15:36:43 -0400 (0:00:00.062) 0:01:47.478 ********
included: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.rhc : Ensure ansible_facts used by role] *******
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/set_vars.yml:3
Saturday 21 March 2026 15:36:43 -0400 (0:00:00.050) 0:01:47.528 ********
ok: [managed-node1]

TASK [fedora.linux_system_roles.rhc : Check if system is ostree] ***************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/set_vars.yml:11
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.543) 0:01:48.071 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.rhc : Set flag to indicate system is ostree] ***
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/set_vars.yml:16
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.014) 0:01:48.086 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.rhc : Check if insights-packages are installed] ***
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/set_vars.yml:20
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.014) 0:01:48.100 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.rhc : Handle insights unregistration] **********
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/main.yml:6
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.013) 0:01:48.114 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.rhc : Handle system subscription] **************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/main.yml:15
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.014) 0:01:48.129 ********
included: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/subscription-manager.yml for managed-node1

TASK [fedora.linux_system_roles.rhc : Ensure required packages are installed] ***
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/subscription-manager.yml:3
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.025) 0:01:48.154 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.rhc : Get subscription status] *****************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/subscription-manager.yml:10
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.015) 0:01:48.169 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.rhc : Call subscription-manager] ***************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/subscription-manager.yml:23
Saturday 21 March 2026 15:36:44 -0400 (0:00:00.020) 0:01:48.190 ********
ok: [managed-node1] => {
    "changed": false
}

MSG:

System already unregistered.

TASK [fedora.linux_system_roles.rhc : Set or unset the release] ****************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/subscription-manager.yml:49
Saturday 21 March 2026 15:36:45 -0400 (0:00:00.858) 0:01:49.048 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.rhc : Configure repositories] ******************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/subscription-manager.yml:58
Saturday 21 March 2026 15:36:45 -0400 (0:00:00.019) 0:01:49.067 ********

TASK [fedora.linux_system_roles.rhc : Handle insights registration] ************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/main.yml:18
Saturday 21 March 2026 15:36:45 -0400 (0:00:00.012) 0:01:49.079 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up Candlepin container] ********************************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml:347
Saturday 21 March 2026 15:36:45 -0400 (0:00:00.014) 0:01:49.094 ********
included: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml for managed-node1

TASK [Check if the candlepin container exists] *********************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml:6
Saturday 21 March 2026 15:36:45 -0400 (0:00:00.040) 0:01:49.134 ********
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "podman",
        "ps",
        "-a",
        "--filter",
        "name=candlepin"
    ],
    "delta": "0:00:00.036212",
    "end": "2026-03-21 15:36:45.770675",
    "rc": 0,
    "start": "2026-03-21 15:36:45.734463"
}

STDOUT:

CONTAINER ID  IMAGE  COMMAND  CREATED  STATUS  PORTS  NAMES
1ffd6cfbe6aa
ghcr.io/candlepin/candlepin-unofficial:latest  /sbin/init  40 seconds ago  Up 40 seconds  0.0.0.0:8080->8080/tcp, 0.0.0.0:8443->8443/tcp  candlepin

TASK [Ensure that Candlepin container doesn't exist] ***************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml:17
Saturday 21 March 2026 15:36:45 -0400 (0:00:00.418) 0:01:49.552 ********
changed: [managed-node1] => {
    "changed": true,
    "cmd": [
        "podman",
        "stop",
        "candlepin"
    ],
    "delta": "0:00:00.967666",
    "end": "2026-03-21 15:36:47.127496",
    "rc": 0,
    "start": "2026-03-21 15:36:46.159830"
}

STDOUT:

candlepin

TASK [Remove SELinux policy for proxy ports] ***********************************
task path: /tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml:350
Saturday 21 March 2026 15:36:47 -0400 (0:00:01.340) 0:01:50.892 ********
ERROR! the role 'fedora.linux_system_roles.selinux' was not found in fedora.linux_system_roles:ansible.legacy:/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/roles:/root/.ansible/roles:/usr/share/ansible/roles:/etc/ansible/roles:/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc

The error appears to be in '/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml': line 352, column 19, but may be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

        include_role:
          name: fedora.linux_system_roles.selinux
                ^ here

PLAY RECAP *********************************************************************
managed-node1              : ok=38   changed=9    unreachable=0    failed=0    skipped=23   rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Saturday 21 March 2026 15:36:47 -0400 (0:00:00.038) 0:01:50.931 ********
===============================================================================
Install needed packages ------------------------------------------------ 49.47s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:31
Start Candlepin container ---------------------------------------------- 15.85s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:41
Wait for started Candlepin --------------------------------------------- 13.91s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:108
Install packages for squid ---------------------------------------------- 6.22s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:7
Ensure directories exist ------------------------------------------------ 4.33s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:62
Update system certificates store ---------------------------------------- 2.80s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:101
Copy product certificates ----------------------------------------------- 2.34s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:72
Copy Candlepin CA certificate for subscription-manager ------------------ 2.01s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:83
Copy Candlepin CA certificate for system -------------------------------- 1.75s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:92
Check if the candlepin container exists --------------------------------- 1.64s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml:6
Ensure that Candlepin container doesn't exist --------------------------- 1.34s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/teardown_candlepin.yml:17
Gathering Facts --------------------------------------------------------- 0.86s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tests_proxy.yml:3
fedora.linux_system_roles.rhc : Call subscription-manager --------------- 0.86s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/subscription-manager.yml:23
Restart squid ----------------------------------------------------------- 0.80s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:119
Install GPG key for RPM repositories ------------------------------------ 0.56s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:118
Check Candlepin works --------------------------------------------------- 0.55s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/check_candlepin.yml:3
fedora.linux_system_roles.rhc : Ensure ansible_facts used by role ------- 0.54s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/roles/rhc/tasks/set_vars.yml:3
Check the status of the backup of configuration ------------------------- 0.48s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:15
Backup the configuration ------------------------------------------------ 0.48s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_squid.yml:20
Get facts for external test data ---------------------------------------- 0.46s
/tmp/collections-GyO/ansible_collections/fedora/linux_system_roles/tests/rhc/tasks/setup_candlepin.yml:9
-- Logs begin at Sat 2026-03-21 15:27:24 EDT, end at Sat 2026-03-21 15:36:47 EDT. --
Mar 21 15:34:55 managed-node1 sshd[10156]: Received disconnect from 10.31.10.139 port 35878:11: disconnected by user
Mar 21 15:34:55 managed-node1 sshd[10156]: Disconnected from user root 10.31.10.139 port 35878
Mar 21 15:34:55 managed-node1 sshd[10153]: pam_unix(sshd:session): session closed for user root
Mar 21 15:34:55 managed-node1 systemd[1]: session-14.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-14.scope has successfully entered the 'dead' state.
Mar 21 15:34:55 managed-node1 systemd-logind[594]: Session 14 logged out. Waiting for processes to exit.
Mar 21 15:34:55 managed-node1 systemd-logind[594]: Removed session 14.
-- Subject: Session 14 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 14 has been terminated.
Mar 21 15:34:56 managed-node1 sudo[10318]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibdukmiqqlbmvpaytysodnplotrigjdg ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121696.3614268-9579-156227760202938/AnsiballZ_setup.py' Mar 21 15:34:56 managed-node1 sudo[10318]: pam_unix(sudo:session): session opened for user root by root(uid=0) Mar 21 15:34:56 managed-node1 platform-python[10321]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d Mar 21 15:34:57 managed-node1 sudo[10318]: pam_unix(sudo:session): session closed for user root Mar 21 15:34:57 managed-node1 sudo[10469]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-befknvolhnglhakohvwhjtwmenygcddz ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121697.3833997-9599-122639522602676/AnsiballZ_stat.py' Mar 21 15:34:57 managed-node1 sudo[10469]: pam_unix(sudo:session): session opened for user root by root(uid=0) Mar 21 15:34:57 managed-node1 platform-python[10472]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Mar 21 15:34:57 managed-node1 sudo[10469]: pam_unix(sudo:session): session closed for user root Mar 21 15:34:58 managed-node1 sudo[10595]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orvuuapexhsfpumowwqthezknoyksitv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121697.994805-9625-185450703175241/AnsiballZ_setup.py' Mar 21 15:34:58 managed-node1 sudo[10595]: pam_unix(sudo:session): session opened for user root by root(uid=0) Mar 21 15:34:58 managed-node1 platform-python[10598]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d Mar 21 15:34:58 managed-node1 sudo[10595]: 
pam_unix(sudo:session): session closed for user root Mar 21 15:34:58 managed-node1 sudo[10725]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsvlvdmglykxqihrzuxbutapcetukvnd ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121698.5285966-9638-269166762844379/AnsiballZ_lineinfile.py' Mar 21 15:34:58 managed-node1 sudo[10725]: pam_unix(sudo:session): session opened for user root by root(uid=0) Mar 21 15:34:58 managed-node1 platform-python[10728]: ansible-lineinfile Invoked with path=/etc/hosts line=127.0.0.1 candlepin.local regexp=.*candlepin.local state=present backrefs=False create=False backup=False firstmatch=False follow=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None src=None force=None content=NOT_LOGGING_PARAMETER remote_src=None delimiter=None directory_mode=None Mar 21 15:34:58 managed-node1 sudo[10725]: pam_unix(sudo:session): session closed for user root Mar 21 15:34:59 managed-node1 sudo[10851]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyslifwmifaedvpahvvzazabgdudjqpl ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121698.997814-9654-250021591786398/AnsiballZ_dnf.py' Mar 21 15:34:59 managed-node1 sudo[10851]: pam_unix(sudo:session): session opened for user root by root(uid=0) Mar 21 15:34:59 managed-node1 platform-python[10854]: ansible-dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True lock_timeout=30 conf_file=None disable_excludes=None download_dir=None list=None releasever=None Mar 21 15:35:04 
managed-node1 dbus-daemon[596]: [system] Reloaded configuration Mar 21 15:35:05 managed-node1 setsebool[10885]: The virt_use_nfs policy boolean was changed to 1 by root Mar 21 15:35:05 managed-node1 setsebool[10885]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root Mar 21 15:35:21 managed-node1 kernel: SELinux: Converting 389 SID table entries... Mar 21 15:35:21 managed-node1 kernel: SELinux: policy capability network_peer_controls=1 Mar 21 15:35:21 managed-node1 kernel: SELinux: policy capability open_perms=1 Mar 21 15:35:21 managed-node1 kernel: SELinux: policy capability extended_socket_class=1 Mar 21 15:35:21 managed-node1 kernel: SELinux: policy capability always_check_network=0 Mar 21 15:35:21 managed-node1 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 21 15:35:21 managed-node1 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 21 15:35:21 managed-node1 dbus-daemon[596]: [system] Reloaded configuration Mar 21 15:35:21 managed-node1 kernel: fuse: init (API version 7.34) Mar 21 15:35:21 managed-node1 systemd[1]: Mounting FUSE Control File System... -- Subject: Unit sys-fs-fuse-connections.mount has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit sys-fs-fuse-connections.mount has begun starting up. Mar 21 15:35:21 managed-node1 systemd[1]: Mounted FUSE Control File System. -- Subject: Unit sys-fs-fuse-connections.mount has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit sys-fs-fuse-connections.mount has finished starting up. -- -- The start-up result is done. Mar 21 15:35:22 managed-node1 dbus-daemon[596]: [system] Reloaded configuration Mar 21 15:35:22 managed-node1 dbus-daemon[596]: [system] Reloaded configuration Mar 21 15:35:47 managed-node1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
-- Subject: Unit run-rfb401914afa742fcb5321c66c2fb79f7.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit run-rfb401914afa742fcb5321c66c2fb79f7.service has finished starting up. -- -- The start-up result is done. Mar 21 15:35:47 managed-node1 systemd[1]: Starting man-db-cache-update.service... -- Subject: Unit man-db-cache-update.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has begun starting up. Mar 21 15:35:47 managed-node1 systemd[1]: Reloading. Mar 21 15:35:48 managed-node1 sudo[10851]: pam_unix(sudo:session): session closed for user root Mar 21 15:35:48 managed-node1 systemd[1]: man-db-cache-update.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit man-db-cache-update.service has successfully entered the 'dead' state. Mar 21 15:35:48 managed-node1 systemd[1]: Started man-db-cache-update.service. -- Subject: Unit man-db-cache-update.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has finished starting up. -- -- The start-up result is done. Mar 21 15:35:48 managed-node1 systemd[1]: run-rfb401914afa742fcb5321c66c2fb79f7.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit run-rfb401914afa742fcb5321c66c2fb79f7.service has successfully entered the 'dead' state. 
Mar 21 15:35:48 managed-node1 sudo[13361]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjjdoaukhyoqtipshpzgrrorgjzouzfw ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121748.4791205-10055-77801853009699/AnsiballZ_command.py' Mar 21 15:35:48 managed-node1 sudo[13361]: pam_unix(sudo:session): session opened for user root by root(uid=0) Mar 21 15:35:48 managed-node1 platform-python[13364]: ansible-command Invoked with argv=['podman', 'ps', '-a', '--filter', 'name=candlepin'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None Mar 21 15:35:49 managed-node1 kernel: evm: overlay not supported Mar 21 15:35:50 managed-node1 systemd[1]: var-lib-containers-storage-overlay.mount: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state. 
Mar 21 15:35:50 managed-node1 sudo[13361]: pam_unix(sudo:session): session closed for user root Mar 21 15:35:50 managed-node1 sudo[13497]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joibngpzsrngqamnomlkmzicteiepgcd ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121750.1580307-10089-63104685498161/AnsiballZ_command.py' Mar 21 15:35:50 managed-node1 sudo[13497]: pam_unix(sudo:session): session opened for user root by root(uid=0) Mar 21 15:35:50 managed-node1 platform-python[13500]: ansible-command Invoked with argv=['podman', 'run', '--rm', '--detach', '--hostname', 'candlepin.local', '--name', 'candlepin', '--publish', '8443:8443', '--publish', '8080:8080', 'ghcr.io/candlepin/candlepin-unofficial'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None Mar 21 15:36:05 managed-node1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1135] manager: (cni-podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/3) Mar 21 15:36:05 managed-node1 systemd-udevd[13530]: Using default interface naming scheme 'rhel-8.0'. Mar 21 15:36:05 managed-node1 systemd-udevd[13530]: link_config: autonegotiation is unset or enabled, the speed and duplex are not writable. Mar 21 15:36:05 managed-node1 systemd-udevd[13530]: Could not generate persistent MAC address for cni-podman0: No such file or directory Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1400] manager: (veth33ed73a2): new Veth device (/org/freedesktop/NetworkManager/Devices/4) Mar 21 15:36:05 managed-node1 systemd-udevd[13533]: link_config: autonegotiation is unset or enabled, the speed and duplex are not writable. 
Mar 21 15:36:05 managed-node1 kernel: IPv6: ADDRCONF(NETDEV_UP): veth33ed73a2: link is not ready Mar 21 15:36:05 managed-node1 systemd-udevd[13533]: Could not generate persistent MAC address for veth33ed73a2: No such file or directory Mar 21 15:36:05 managed-node1 kernel: cni-podman0: port 1(veth33ed73a2) entered blocking state Mar 21 15:36:05 managed-node1 kernel: cni-podman0: port 1(veth33ed73a2) entered disabled state Mar 21 15:36:05 managed-node1 kernel: device veth33ed73a2 entered promiscuous mode Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1466] device (cni-podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1470] device (cni-podman0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1476] device (cni-podman0): Activation: starting connection 'cni-podman0' (11ec960a-5783-4e48-9291-dcedcd2e7192) Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1477] device (cni-podman0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1479] device (cni-podman0): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1481] device (cni-podman0): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.1482] device (cni-podman0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 dbus-daemon[596]: [system] Activating via systemd: service name='org.freedesktop.nm_dispatcher' unit='dbus-org.freedesktop.nm-dispatcher.service' requested by ':1.5' (uid=0 pid=662 comm="/usr/sbin/NetworkManager 
--no-daemon " label="system_u:system_r:NetworkManager_t:s0") Mar 21 15:36:05 managed-node1 systemd[1]: Starting Network Manager Script Dispatcher Service... -- Subject: Unit NetworkManager-dispatcher.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit NetworkManager-dispatcher.service has begun starting up. Mar 21 15:36:05 managed-node1 kernel: IPv6: ADDRCONF(NETDEV_UP): eth0: link is not ready Mar 21 15:36:05 managed-node1 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 21 15:36:05 managed-node1 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): veth33ed73a2: link becomes ready Mar 21 15:36:05 managed-node1 kernel: cni-podman0: port 1(veth33ed73a2) entered blocking state Mar 21 15:36:05 managed-node1 kernel: cni-podman0: port 1(veth33ed73a2) entered forwarding state Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.2072] device (veth33ed73a2): carrier: link connected Mar 21 15:36:05 managed-node1 dbus-daemon[596]: [system] Successfully activated service 'org.freedesktop.nm_dispatcher' Mar 21 15:36:05 managed-node1 systemd[1]: Started Network Manager Script Dispatcher Service. -- Subject: Unit NetworkManager-dispatcher.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit NetworkManager-dispatcher.service has finished starting up. -- -- The start-up result is done. 
Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.2095] device (cni-podman0): carrier: link connected Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.2108] device (cni-podman0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.2109] device (cni-podman0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external') Mar 21 15:36:05 managed-node1 NetworkManager[662]: [1774121765.2113] device (cni-podman0): Activation: successful, device activated. Mar 21 15:36:05 managed-node1 systemd[1]: Created slice machine.slice. -- Subject: Unit machine.slice has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit machine.slice has finished starting up. -- -- The start-up result is done. Mar 21 15:36:05 managed-node1 systemd[1]: Started libpod-conmon-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope. -- Subject: Unit libpod-conmon-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit libpod-conmon-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope has finished starting up. -- -- The start-up result is done. Mar 21 15:36:05 managed-node1 systemd[1]: Started libcontainer container 1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833. -- Subject: Unit libpod-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit libpod-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope has finished starting up. -- -- The start-up result is done. Mar 21 15:36:05 managed-node1 systemd[1]: run-runc-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833-runc.dcT7js.mount: Succeeded. 
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-runc-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833-runc.dcT7js.mount has successfully entered the 'dead' state.
Mar 21 15:36:05 managed-node1 sudo[13497]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:06 managed-node1 sudo[13912]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzslyffditbwltezahqmpwfdqbdjaqsv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121766.0813673-10237-174534189598920/AnsiballZ_file.py'
Mar 21 15:36:06 managed-node1 sudo[13912]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:07 managed-node1 platform-python[13915]: ansible-file Invoked with path=/etc/pki/product state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Mar 21 15:36:07 managed-node1 sudo[13912]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:08 managed-node1 sudo[14038]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbtvyyixamoldubmfecqirrmkwkelzni ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121767.3738947-10237-152536731325891/AnsiballZ_file.py'
Mar 21 15:36:08 managed-node1 sudo[14038]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:08 managed-node1 platform-python[14041]: ansible-file Invoked with path=/etc/pki/product-default state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Mar 21 15:36:08 managed-node1 sudo[14038]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:09 managed-node1 sudo[14164]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzjmumqhvdewmhqbnzhglrwsyiuzwhwv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121768.7964683-10237-27040819076014/AnsiballZ_file.py'
Mar 21 15:36:09 managed-node1 sudo[14164]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:09 managed-node1 platform-python[14167]: ansible-file Invoked with path=/etc/rhsm/ca state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Mar 21 15:36:09 managed-node1 sudo[14164]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:11 managed-node1 sudo[14290]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtctkwmyxnzucvcqqduezgtttycghfpt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121770.5380514-10306-78511008037772/AnsiballZ_command.py'
Mar 21 15:36:11 managed-node1 sudo[14290]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:11 managed-node1 platform-python[14293]: ansible-command Invoked with argv=['podman', 'cp', 'candlepin:/home/candlepin/devel/candlepin/generated_certs/7050.pem', '/etc/pki/product-default/'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None
Mar 21 15:36:12 managed-node1 sudo[14290]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:13 managed-node1 sudo[14451]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waqncmnuvmyweevjybcjkcrtndbkfcld ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121772.9203403-10364-152043189348210/AnsiballZ_command.py'
Mar 21 15:36:13 managed-node1 sudo[14451]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:14 managed-node1 platform-python[14454]: ansible-command Invoked with argv=['podman', 'cp', 'candlepin:/etc/candlepin/certs/candlepin-ca.crt', '/etc/rhsm/ca/candlepin-ca.pem'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None
Mar 21 15:36:14 managed-node1 sudo[14451]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:15 managed-node1 systemd[1]: NetworkManager-dispatcher.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state.
Mar 21 15:36:15 managed-node1 sudo[14614]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpfivawwrfsadogbrdgktarkbimccunm ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121774.8193057-10419-55190051882797/AnsiballZ_command.py'
Mar 21 15:36:15 managed-node1 sudo[14614]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:15 managed-node1 platform-python[14618]: ansible-command Invoked with argv=['podman', 'cp', 'candlepin:/etc/candlepin/certs/candlepin-ca.crt', '/etc/pki/ca-trust/source/anchors/candlepin-ca.pem'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None
Mar 21 15:36:16 managed-node1 sudo[14614]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:16 managed-node1 sudo[14778]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtutsssawyymkntxgqiqtwbxjrexgmhb ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121776.4944346-10442-178399795616518/AnsiballZ_command.py'
Mar 21 15:36:16 managed-node1 sudo[14778]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:17 managed-node1 platform-python[14781]: ansible-command Invoked with argv=['update-ca-trust', 'extract'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None
Mar 21 15:36:19 managed-node1 sudo[14778]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:20 managed-node1 sudo[14911]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnhsatmsfhrizgiwmjacydaqffhuontx ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121779.297895-10483-242972070067018/AnsiballZ_uri.py'
Mar 21 15:36:20 managed-node1 sudo[14911]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:20 managed-node1 platform-python[14914]: ansible-uri Invoked with url=https://candlepin.local:8443/candlepin method=HEAD validate_certs=False force=False http_agent=ansible-httpget use_proxy=True force_basic_auth=False body_format=raw return_content=False follow_redirects=safe status_code=[200] timeout=30 headers={} follow=False unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Mar 21 15:36:33 managed-node1 sudo[14911]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:33 managed-node1 sudo[15131]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uutkbuyyvwmildhjxfsdtamisqmnsshr ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121793.1764724-10725-186113065592624/AnsiballZ_get_url.py'
Mar 21 15:36:33 managed-node1 sudo[15131]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:33 managed-node1 platform-python[15134]: ansible-get_url Invoked with url=http://candlepin.local:8080/RPM-GPG-KEY-candlepin dest=/etc/pki/rpm-gpg/RPM-GPG-KEY-candlepin mode=0644 force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False sha256sum= checksum= timeout=10 follow=False unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None backup=None headers=None tmp_dest=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None src=None content=NOT_LOGGING_PARAMETER remote_src=None regexp=None delimiter=None directory_mode=None
Mar 21 15:36:33 managed-node1 sudo[15131]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:33 managed-node1 sudo[15257]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgakijzjkyoyjezefuafkqmgfzldcsik ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121793.752442-10743-185405354818953/AnsiballZ_uri.py'
Mar 21 15:36:33 managed-node1 sudo[15257]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:34 managed-node1 platform-python[15260]: ansible-uri Invoked with url=https://candlepin.local:8443/candlepin method=HEAD validate_certs=False force=False http_agent=ansible-httpget use_proxy=True force_basic_auth=False body_format=raw return_content=False follow_redirects=safe status_code=[200] timeout=30 headers={} follow=False unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Mar 21 15:36:34 managed-node1 sudo[15257]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:34 managed-node1 sudo[15383]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pliabxsgthrxiyffvsxnqisafxybooyq ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121794.40559-10861-10754227072033/AnsiballZ_dnf.py'
Mar 21 15:36:34 managed-node1 sudo[15383]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:34 managed-node1 platform-python[15386]: ansible-dnf Invoked with name=['squid', 'httpd-tools'] state=present allow_downgrade=False autoremove=False bugfix=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True lock_timeout=30 conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Mar 21 15:36:37 managed-node1 groupadd[15413]: group added to /etc/group: name=squid, GID=23
Mar 21 15:36:37 managed-node1 groupadd[15413]: group added to /etc/gshadow: name=squid
Mar 21 15:36:37 managed-node1 groupadd[15413]: new group: name=squid, GID=23
Mar 21 15:36:37 managed-node1 useradd[15420]: new user: name=squid, UID=23, GID=23, home=/var/spool/squid, shell=/sbin/nologin
Mar 21 15:36:39 managed-node1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
-- Subject: Unit run-rc33bbeb0bf7842bb9d6f2f09b2a16061.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit run-rc33bbeb0bf7842bb9d6f2f09b2a16061.service has finished starting up.
--
-- The start-up result is done.
Mar 21 15:36:39 managed-node1 systemd[1]: Starting man-db-cache-update.service...
-- Subject: Unit man-db-cache-update.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has begun starting up.
Mar 21 15:36:39 managed-node1 systemd[1]: Reloading.
Mar 21 15:36:40 managed-node1 sudo[15383]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:40 managed-node1 sudo[17353]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzbrzcottxqxejvhlmsnullauwjfhaab ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121800.5246773-11065-268185880884750/AnsiballZ_stat.py'
Mar 21 15:36:40 managed-node1 sudo[17353]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:40 managed-node1 platform-python[17380]: ansible-stat Invoked with path=/etc/squid/squid.conf.BACKUP follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 21 15:36:40 managed-node1 sudo[17353]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:41 managed-node1 systemd[1]: man-db-cache-update.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit man-db-cache-update.service has successfully entered the 'dead' state.
Mar 21 15:36:41 managed-node1 systemd[1]: Started man-db-cache-update.service.
-- Subject: Unit man-db-cache-update.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has finished starting up.
--
-- The start-up result is done.
Mar 21 15:36:41 managed-node1 systemd[1]: run-rc33bbeb0bf7842bb9d6f2f09b2a16061.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-rc33bbeb0bf7842bb9d6f2f09b2a16061.service has successfully entered the 'dead' state.
Mar 21 15:36:41 managed-node1 sudo[17938]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzuzdujdnvqvqnxvsyexljlpeqezmhrt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121801.0041344-11085-174417681712791/AnsiballZ_copy.py'
Mar 21 15:36:41 managed-node1 sudo[17938]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:41 managed-node1 platform-python[17941]: ansible-copy Invoked with src=/etc/squid/squid.conf dest=/etc/squid/squid.conf.BACKUP remote_src=True mode=0644 backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None regexp=None delimiter=None
Mar 21 15:36:41 managed-node1 sudo[17938]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:41 managed-node1 sudo[18066]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfeamkiahfzerwfseockutzcxkwzexbu ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121801.540099-11106-248319624782944/AnsiballZ_lineinfile.py'
Mar 21 15:36:41 managed-node1 sudo[18066]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:41 managed-node1 platform-python[18069]: ansible-lineinfile Invoked with path=/etc/squid/squid.conf regexp=^acl SSL_ports port 8443 insertbefore=^acl Safe_ports firstmatch=True line=acl SSL_ports port 8443 # Candlepin state=present backrefs=False create=False backup=False follow=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None src=None force=None content=NOT_LOGGING_PARAMETER remote_src=None delimiter=None directory_mode=None
Mar 21 15:36:41 managed-node1 sudo[18066]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:42 managed-node1 sudo[18192]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxtosxosamdcsplbczcgcfphxbopewnb ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121801.910714-11119-1480981864777/AnsiballZ_lineinfile.py'
Mar 21 15:36:42 managed-node1 sudo[18192]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:42 managed-node1 platform-python[18195]: ansible-lineinfile Invoked with path=/etc/squid/squid.conf regexp=^shutdown_lifetime line=shutdown_lifetime 5 seconds state=present backrefs=False create=False backup=False firstmatch=False follow=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None src=None force=None content=NOT_LOGGING_PARAMETER remote_src=None delimiter=None directory_mode=None
Mar 21 15:36:42 managed-node1 sudo[18192]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:42 managed-node1 sudo[18318]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjmhlvdaodaixsmfjbhklriwyblpmdmt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121802.280671-11130-280542291015793/AnsiballZ_lineinfile.py'
Mar 21 15:36:42 managed-node1 sudo[18318]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:42 managed-node1 platform-python[18321]: ansible-lineinfile Invoked with path=/etc/squid/squid.conf regexp=^http_port line=http_port 3128 state=present backrefs=False create=False backup=False firstmatch=False follow=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None src=None force=None content=NOT_LOGGING_PARAMETER remote_src=None delimiter=None directory_mode=None
Mar 21 15:36:42 managed-node1 sudo[18318]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:43 managed-node1 sudo[18444]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyvuhrvonwynwrntuzexbuhknmjcprzi ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121802.865494-11162-234196587167665/AnsiballZ_systemd.py'
Mar 21 15:36:43 managed-node1 sudo[18444]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:43 managed-node1 platform-python[18447]: ansible-systemd Invoked with name=squid state=restarted daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None user=None scope=None
Mar 21 15:36:43 managed-node1 systemd[1]: Starting Squid caching proxy...
-- Subject: Unit squid.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit squid.service has begun starting up.
Mar 21 15:36:43 managed-node1 squid[18460]: Squid Parent: will start 1 kids
Mar 21 15:36:43 managed-node1 squid[18460]: Squid Parent: (squid-1) process 18462 started
Mar 21 15:36:43 managed-node1 systemd[1]: Started Squid caching proxy.
-- Subject: Unit squid.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit squid.service has finished starting up.
--
-- The start-up result is done.
Mar 21 15:36:43 managed-node1 sudo[18444]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:44 managed-node1 sudo[18624]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmngkvsrwedpgjqnwsgaskfjjwxbrosj ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121803.8741217-11193-160140633063546/AnsiballZ_setup.py'
Mar 21 15:36:44 managed-node1 sudo[18624]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:44 managed-node1 platform-python[18627]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d
Mar 21 15:36:44 managed-node1 sudo[18624]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:44 managed-node1 sudo[18754]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcslxklapfmjwphtgzdcpxjlehdubsoz ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121804.556339-11229-142029629174634/AnsiballZ_redhat_subscription.py'
Mar 21 15:36:44 managed-node1 sudo[18754]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:44 managed-node1 platform-python[18757]: ansible-community.general.redhat_subscription Invoked with state=absent force_register=False pool_ids=[] username=None password=NOT_LOGGING_PARAMETER token=NOT_LOGGING_PARAMETER server_hostname=None server_insecure=None server_prefix=None server_port=None rhsm_baseurl=None rhsm_repo_ca_cert=None auto_attach=None activationkey=NOT_LOGGING_PARAMETER org_id=None environment=None consumer_type=None consumer_name=None consumer_id=None server_proxy_hostname=None server_proxy_scheme=None server_proxy_port=None server_proxy_user=None server_proxy_password=NOT_LOGGING_PARAMETER release=None syspurpose=None
Mar 21 15:36:45 managed-node1 sudo[18754]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:45 managed-node1 sudo[18882]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piexldncbkghuvzxxhuyirjyotkekbbf ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121805.476377-11255-63607579071945/AnsiballZ_command.py'
Mar 21 15:36:45 managed-node1 sudo[18882]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:45 managed-node1 platform-python[18885]: ansible-command Invoked with argv=['podman', 'ps', '-a', '--filter', 'name=candlepin'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None
Mar 21 15:36:45 managed-node1 sudo[18882]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:46 managed-node1 sudo[19015]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fijhwgcpykolddhsfqunehlbmzwlaxte ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1774121805.9038715-11265-3486638207303/AnsiballZ_command.py'
Mar 21 15:36:46 managed-node1 sudo[19015]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Mar 21 15:36:46 managed-node1 platform-python[19018]: ansible-command Invoked with argv=['podman', 'stop', 'candlepin'] warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True _raw_params=None chdir=None executable=None creates=None removes=None stdin=None
Mar 21 15:36:46 managed-node1 systemd[1]: libpod-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit libpod-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope has successfully entered the 'dead' state.
Mar 21 15:36:46 managed-node1 systemd[1]: libpod-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope: Consumed 47.882s CPU time
-- Subject: Resources consumed by unit runtime
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit libpod-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope completed and consumed the indicated resources.
Mar 21 15:36:47 managed-node1 kernel: cni-podman0: port 1(veth33ed73a2) entered disabled state
Mar 21 15:36:47 managed-node1 kernel: device veth33ed73a2 left promiscuous mode
Mar 21 15:36:47 managed-node1 kernel: cni-podman0: port 1(veth33ed73a2) entered disabled state
Mar 21 15:36:47 managed-node1 systemd[1]: run-netns-netns\x2de217f346\x2d9c81\x2d6909\x2dddb5\x2d9af99fdc71a6.mount: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-netns-netns\x2de217f346\x2d9c81\x2d6909\x2dddb5\x2d9af99fdc71a6.mount has successfully entered the 'dead' state.
Mar 21 15:36:47 managed-node1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833-userdata-shm.mount: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit var-lib-containers-storage-overlay\x2dcontainers-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833-userdata-shm.mount has successfully entered the 'dead' state.
Mar 21 15:36:47 managed-node1 systemd[1]: var-lib-containers-storage-overlay-82a743361b2ed511c7513355ba29bdd37c0054bc78f959a589649b381fe23b41-merged.mount: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit var-lib-containers-storage-overlay-82a743361b2ed511c7513355ba29bdd37c0054bc78f959a589649b381fe23b41-merged.mount has successfully entered the 'dead' state.
Mar 21 15:36:47 managed-node1 systemd[1]: var-lib-containers-storage-overlay.mount: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Mar 21 15:36:47 managed-node1 sudo[19015]: pam_unix(sudo:session): session closed for user root
Mar 21 15:36:47 managed-node1 systemd[1]: libpod-conmon-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit libpod-conmon-1ffd6cfbe6aaadee733c4eafe0e07b8ae8362945be499db582d4e77802f7b833.scope has successfully entered the 'dead' state.
Mar 21 15:36:47 managed-node1 sshd[19169]: Accepted publickey for root from 10.31.10.139 port 38350 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 21 15:36:47 managed-node1 systemd-logind[594]: New session 15 of user root.
-- Subject: A new session 15 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 15 has been created for the user root.
--
-- The leading process of the session is 19169.
Mar 21 15:36:47 managed-node1 systemd[1]: Started Session 15 of user root.
-- Subject: Unit session-15.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-15.scope has finished starting up.
--
-- The start-up result is done.
Mar 21 15:36:47 managed-node1 sshd[19169]: pam_unix(sshd:session): session opened for user root by (uid=0)
Mar 21 15:36:47 managed-node1 sshd[19172]: Received disconnect from 10.31.10.139 port 38350:11: disconnected by user
Mar 21 15:36:47 managed-node1 sshd[19172]: Disconnected from user root 10.31.10.139 port 38350
Mar 21 15:36:47 managed-node1 sshd[19169]: pam_unix(sshd:session): session closed for user root
Mar 21 15:36:47 managed-node1 systemd[1]: session-15.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-15.scope has successfully entered the 'dead' state.
Mar 21 15:36:47 managed-node1 systemd-logind[594]: Session 15 logged out. Waiting for processes to exit.
Mar 21 15:36:47 managed-node1 systemd-logind[594]: Removed session 15.
-- Subject: Session 15 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 15 has been terminated.
Mar 21 15:36:47 managed-node1 sshd[19193]: Accepted publickey for root from 10.31.10.139 port 38358 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 21 15:36:47 managed-node1 systemd[1]: Started Session 16 of user root.
-- Subject: Unit session-16.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-16.scope has finished starting up.
--
-- The start-up result is done.
Mar 21 15:36:47 managed-node1 systemd-logind[594]: New session 16 of user root.
-- Subject: A new session 16 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 16 has been created for the user root.
--
-- The leading process of the session is 19193.
Mar 21 15:36:47 managed-node1 sshd[19193]: pam_unix(sshd:session): session opened for user root by (uid=0)