Mai 07 19:25:15 devstack nova-compute[73559]: Service is starting with native threading. This is currently experimental. Do not use it in production without first testing it in pre-production.
Mai 07 19:25:18 devstack nova-compute[73559]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=73559) initialize /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:44}}
Mai 07 19:25:18 devstack nova-compute[73559]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=73559) initialize /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:44}}
Mai 07 19:25:18 devstack nova-compute[73559]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=73559) initialize /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:44}}
Mai 07 19:25:18 devstack nova-compute[73559]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Mai 07 19:25:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:25:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.014s {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:25:19 devstack nova-compute[73559]: INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Mai 07 19:25:19 devstack nova-compute[73559]: WARNING nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] In native threading mode the number of concurrent builds and snapshots should be limited to the same number. The current configuration has differing limits: max_concurrent_builds: 10, max_concurrent_snapshots: 5. Nova will use a single, overall limit of 10 for these tasks.
Mai 07 19:25:19 devstack nova-compute[73559]: INFO nova.utils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] The long task thread pool MainProcess.long_task is initialized
Mai 07 19:25:19 devstack nova-compute[73559]: INFO nova.virt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Loading compute driver 'libvirt.LibvirtDriver'
Mai 07 19:25:20 devstack nova-compute[73559]: INFO nova.compute.provider_config [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Mai 07 19:25:20 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Acquiring lock "singleton_lock" {{(pid=73559) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}}
Mai 07 19:25:20 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Acquired lock "singleton_lock" {{(pid=73559) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}}
Mai 07 19:25:20 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Releasing lock "singleton_lock" {{(pid=73559) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}}
Mai 07 19:25:20 devstack nova-compute[73559]: WARNING oslo_service.backend._threading.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] no_fork=True: running service in main process
Mai 07 19:25:20 devstack nova-compute[73559]: INFO nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Starting compute node (version 33.1.0)
Mai 07 19:25:21 devstack nova-compute[73559]: INFO nova.utils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] The default thread pool MainProcess.default is initialized
Mai 07 19:25:21 devstack nova-compute[73559]: DEBUG nova.utils [-] Waiting for the next task {{(pid=73559) _log /opt/stack/nova/nova/utils.py:1500}}
Mai 07 19:25:21 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Starting event thread {{(pid=73559) start /opt/stack/nova/nova/virt/libvirt/host.py:252}}
Mai 07 19:25:21 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Starting connection event dispatch thread {{(pid=73559) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:553}}
Mai 07 19:25:21 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Connecting to libvirt: qemu:///system {{(pid=73559) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:558}}
Mai 07 19:25:22 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Registering for lifecycle events {{(pid=73559) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:564}}
Mai 07 19:25:22 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Registering for connection events: {{(pid=73559) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:585}}
Mai 07 19:25:22 devstack nova-compute[73559]: INFO nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Connection event '1' reason 'None'
Mai 07 19:25:22 devstack nova-compute[73559]: WARNING nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Cannot update service status on host "devstack" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host devstack could not be found.
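[Editor's note on the concurrency warning above: in native threading mode Nova collapses the build and snapshot limits into a single shared limit, so the two options are normally set to the same value. A minimal, illustrative `nova.conf` fragment that would silence the warning (assuming both options live under `[DEFAULT]`; the values are examples, not a sizing recommendation):]

```ini
# Illustrative fragment only. In native threading mode Nova applies one
# shared limit to builds and snapshots, so keep these two values equal.
[DEFAULT]
max_concurrent_builds = 10
max_concurrent_snapshots = 10
```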
Mai 07 19:25:22 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.volume.mount [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Initialising _HostMountState generation 0 {{(pid=73559) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}}
Mai 07 19:25:36 devstack nova-compute[73559]: INFO nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host capabilities
[Editor's note: the multi-line capabilities XML that follows this entry lost all of its markup on extraction, leaving only repeated timestamps and bare values. Recoverable details from the residue: host UUID 1edef36a-6b3a-4b67-b01c-d6a682c117a8; x86_64 host CPU, model EPYC-IBPB, vendor AMD; live-migration transports tcp and rdma; 12248276 KiB of memory; security model dac with baselabel +64055:+994; and a series of hvm guest entries for alpha (/usr/bin/qemu-system-alpha, 64-bit), arm (/usr/bin/qemu-system-arm, 32-bit), aarch64 (/usr/bin/qemu-system-aarch64, 64-bit), cris (/usr/bin/qemu-system-cris, 32-bit), and i386 (/usr/bin/qemu-system-i386, 32-bit), each listing its supported machine types (virt-2.6 through virt-8.2, numerous board models such as ast2600-evb and raspi2b, and the pc-i440fx-*/pc-q35-* families). The dump is truncated mid-list.]
nova-compute[73559]: isapc Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-mantic-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-cosmic Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-2.6 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-mantic-maxcpus Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-3.1 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-bionic Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-disco-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-cosmic Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-2.12 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-bionic Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-kinetic-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-groovy-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-7.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-noble-v2 Mai 07 19:25:36 devstack nova-compute[73559]: ubuntu-q35 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-lunar-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-disco Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-cosmic-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-noble Mai 07 19:25:36 devstack nova-compute[73559]: q35 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-2.1 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-8.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-impish Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-wily Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-8.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-2.6 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-6.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-impish Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-jammy Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-impish-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-hirsute Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-4.0.1 Mai 07 19:25:36 
devstack nova-compute[73559]: pc-q35-hirsute-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-7.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-5.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-2.8 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-2.10 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-3.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-6.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-zesty Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-7.2 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-4.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-focal Mai 07 19:25:36 devstack nova-compute[73559]: microvm Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-2.3 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-jammy-hpb-maxcpus Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-8.2 Mai 07 19:25:36 devstack nova-compute[73559]: q35 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-kinetic-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-focal-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-4.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-noble Mai 07 19:25:36 devstack nova-compute[73559]: pc Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-disco Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-groovy-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-8.2 Mai 07 19:25:36 devstack nova-compute[73559]: pc Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-hirsute-hpb Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-5.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-6.2 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-2.8 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-eoan Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-2.5 Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-3.0 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-yakkety Mai 07 19:25:36 devstack nova-compute[73559]: 
pc-i440fx-mantic-hpb-maxcpus Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-7.2 Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-2.11 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 32 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-m68k Mai 07 19:25:36 devstack nova-compute[73559]: mcf5208evb Mai 07 19:25:36 devstack nova-compute[73559]: virt-7.0 Mai 07 19:25:36 devstack nova-compute[73559]: an5206 Mai 07 19:25:36 devstack nova-compute[73559]: virt-6.0 Mai 07 19:25:36 devstack nova-compute[73559]: q800 Mai 07 19:25:36 devstack nova-compute[73559]: virt-8.1 Mai 07 19:25:36 devstack nova-compute[73559]: virt-7.2 Mai 07 19:25:36 devstack nova-compute[73559]: virt-6.2 Mai 07 19:25:36 devstack nova-compute[73559]: virt-8.0 Mai 07 19:25:36 devstack nova-compute[73559]: next-cube Mai 07 19:25:36 devstack nova-compute[73559]: virt-7.1 Mai 07 19:25:36 devstack nova-compute[73559]: virt-6.1 Mai 07 19:25:36 devstack nova-compute[73559]: virt-8.2 Mai 07 19:25:36 devstack nova-compute[73559]: virt Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack 
nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 32 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-microblaze Mai 07 19:25:36 devstack nova-compute[73559]: petalogix-s3adsp1800 Mai 07 19:25:36 devstack nova-compute[73559]: petalogix-ml605 Mai 07 19:25:36 devstack nova-compute[73559]: xlnx-zynqmp-pmu Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 32 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-microblazeel Mai 07 19:25:36 devstack nova-compute[73559]: petalogix-s3adsp1800 Mai 07 19:25:36 devstack nova-compute[73559]: petalogix-ml605 Mai 07 19:25:36 devstack nova-compute[73559]: xlnx-zynqmp-pmu Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack 
nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 32 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-mips Mai 07 19:25:36 devstack nova-compute[73559]: malta Mai 07 19:25:36 devstack nova-compute[73559]: mipssim Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 32 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-mipsel Mai 07 19:25:36 devstack nova-compute[73559]: malta Mai 07 19:25:36 devstack nova-compute[73559]: mipssim Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 64 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-mips64 Mai 07 19:25:36 devstack nova-compute[73559]: malta Mai 07 19:25:36 devstack nova-compute[73559]: mipssim Mai 07 19:25:36 devstack 
nova-compute[73559]: pica61 Mai 07 19:25:36 devstack nova-compute[73559]: magnum Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 64 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-mips64el Mai 07 19:25:36 devstack nova-compute[73559]: malta Mai 07 19:25:36 devstack nova-compute[73559]: loongson3-virt Mai 07 19:25:36 devstack nova-compute[73559]: mipssim Mai 07 19:25:36 devstack nova-compute[73559]: pica61 Mai 07 19:25:36 devstack nova-compute[73559]: magnum Mai 07 19:25:36 devstack nova-compute[73559]: boston Mai 07 19:25:36 devstack nova-compute[73559]: fuloong2e Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 32 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-ppc Mai 07 19:25:36 devstack nova-compute[73559]: g3beige Mai 07 19:25:36 devstack nova-compute[73559]: amigaone Mai 07 19:25:36 devstack nova-compute[73559]: virtex-ml507 Mai 07 
19:25:36 devstack nova-compute[73559]: mac99 Mai 07 19:25:36 devstack nova-compute[73559]: ppce500 Mai 07 19:25:36 devstack nova-compute[73559]: sam460ex Mai 07 19:25:36 devstack nova-compute[73559]: pegasos2 Mai 07 19:25:36 devstack nova-compute[73559]: bamboo Mai 07 19:25:36 devstack nova-compute[73559]: 40p Mai 07 19:25:36 devstack nova-compute[73559]: ref405ep Mai 07 19:25:36 devstack nova-compute[73559]: mpc8544ds Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 64 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-ppc64 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-noble Mai 07 19:25:36 devstack nova-compute[73559]: pseries Mai 07 19:25:36 devstack nova-compute[73559]: amigaone Mai 07 19:25:36 devstack nova-compute[73559]: powernv9 Mai 07 19:25:36 devstack nova-compute[73559]: powernv Mai 07 19:25:36 devstack nova-compute[73559]: pseries-4.1 Mai 07 19:25:36 devstack nova-compute[73559]: mpc8544ds Mai 07 19:25:36 devstack nova-compute[73559]: pseries-6.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.5 Mai 07 19:25:36 devstack nova-compute[73559]: powernv10 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-xenial Mai 07 19:25:36 devstack nova-compute[73559]: pseries-4.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-6.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-yakkety Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.6 Mai 07 19:25:36 devstack 
nova-compute[73559]: ppce500 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-bionic-sxxm Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.7 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-3.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-8.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-5.0 Mai 07 19:25:36 devstack nova-compute[73559]: 40p Mai 07 19:25:36 devstack nova-compute[73559]: pseries-lunar Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.8 Mai 07 19:25:36 devstack nova-compute[73559]: pegasos2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-hirsute Mai 07 19:25:36 devstack nova-compute[73559]: pseries-3.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-8.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-5.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-eoan Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.9 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-zesty Mai 07 19:25:36 devstack nova-compute[73559]: bamboo Mai 07 19:25:36 devstack nova-compute[73559]: pseries-groovy Mai 07 19:25:36 devstack nova-compute[73559]: pseries-focal Mai 07 19:25:36 devstack nova-compute[73559]: g3beige Mai 07 19:25:36 devstack nova-compute[73559]: pseries-8.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-5.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-disco Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.12-sxxm Mai 07 19:25:36 devstack nova-compute[73559]: pseries-mantic Mai 07 19:25:36 devstack nova-compute[73559]: pseries-kinetic Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.10 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-7.0 Mai 07 19:25:36 devstack nova-compute[73559]: virtex-ml507 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.11 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-7.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-cosmic Mai 07 19:25:36 devstack 
nova-compute[73559]: pseries-bionic Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.12 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-7.2 Mai 07 19:25:36 devstack nova-compute[73559]: mac99 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-impish Mai 07 19:25:36 devstack nova-compute[73559]: pseries-jammy Mai 07 19:25:36 devstack nova-compute[73559]: pseries-artful Mai 07 19:25:36 devstack nova-compute[73559]: sam460ex Mai 07 19:25:36 devstack nova-compute[73559]: ref405ep Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.3 Mai 07 19:25:36 devstack nova-compute[73559]: powernv8 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-4.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-6.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.4 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 64 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-ppc64le Mai 07 19:25:36 devstack nova-compute[73559]: pseries-noble Mai 07 19:25:36 devstack nova-compute[73559]: pseries Mai 07 19:25:36 devstack nova-compute[73559]: amigaone Mai 07 19:25:36 devstack nova-compute[73559]: powernv9 Mai 07 19:25:36 devstack nova-compute[73559]: powernv Mai 07 19:25:36 devstack nova-compute[73559]: pseries-4.1 Mai 07 19:25:36 devstack nova-compute[73559]: mpc8544ds Mai 07 19:25:36 devstack nova-compute[73559]: pseries-6.1 Mai 07 19:25:36 devstack 
nova-compute[73559]: pseries-2.5 Mai 07 19:25:36 devstack nova-compute[73559]: powernv10 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-xenial Mai 07 19:25:36 devstack nova-compute[73559]: pseries-4.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-6.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-yakkety Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.6 Mai 07 19:25:36 devstack nova-compute[73559]: ppce500 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-bionic-sxxm Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.7 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-3.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-8.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-5.0 Mai 07 19:25:36 devstack nova-compute[73559]: 40p Mai 07 19:25:36 devstack nova-compute[73559]: pseries-lunar Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.8 Mai 07 19:25:36 devstack nova-compute[73559]: pegasos2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-hirsute Mai 07 19:25:36 devstack nova-compute[73559]: pseries-3.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-8.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-5.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-eoan Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.9 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-zesty Mai 07 19:25:36 devstack nova-compute[73559]: bamboo Mai 07 19:25:36 devstack nova-compute[73559]: pseries-groovy Mai 07 19:25:36 devstack nova-compute[73559]: pseries-focal Mai 07 19:25:36 devstack nova-compute[73559]: g3beige Mai 07 19:25:36 devstack nova-compute[73559]: pseries-8.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-5.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-disco Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.12-sxxm Mai 07 19:25:36 devstack nova-compute[73559]: pseries-mantic Mai 07 19:25:36 devstack nova-compute[73559]: pseries-kinetic Mai 07 19:25:36 devstack 
nova-compute[73559]: pseries-2.10 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-7.0 Mai 07 19:25:36 devstack nova-compute[73559]: virtex-ml507 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.11 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-7.1 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-cosmic Mai 07 19:25:36 devstack nova-compute[73559]: pseries-bionic Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.12 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.2 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-7.2 Mai 07 19:25:36 devstack nova-compute[73559]: mac99 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-impish Mai 07 19:25:36 devstack nova-compute[73559]: pseries-jammy Mai 07 19:25:36 devstack nova-compute[73559]: pseries-artful Mai 07 19:25:36 devstack nova-compute[73559]: sam460ex Mai 07 19:25:36 devstack nova-compute[73559]: ref405ep Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.3 Mai 07 19:25:36 devstack nova-compute[73559]: powernv8 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-4.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-6.0 Mai 07 19:25:36 devstack nova-compute[73559]: pseries-2.4 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 32 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-riscv32 Mai 07 19:25:36 devstack nova-compute[73559]: virt Mai 07 
19:25:36 devstack nova-compute[73559]: spike Mai 07 19:25:36 devstack nova-compute[73559]: opentitan Mai 07 19:25:36 devstack nova-compute[73559]: sifive_u Mai 07 19:25:36 devstack nova-compute[73559]: sifive_e Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 64 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-riscv64 Mai 07 19:25:36 devstack nova-compute[73559]: virt Mai 07 19:25:36 devstack nova-compute[73559]: spike Mai 07 19:25:36 devstack nova-compute[73559]: microchip-icicle-kit Mai 07 19:25:36 devstack nova-compute[73559]: sifive_u Mai 07 19:25:36 devstack nova-compute[73559]: shakti_c Mai 07 19:25:36 devstack nova-compute[73559]: sifive_e Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: hvm Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 64 Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-s390x Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-noble Mai 07 19:25:36 devstack 
nova-compute[73559]: s390-ccw-virtio Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-8.0 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-disco Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-6.0 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-2.11 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-7.0 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-mantic Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-xenial Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-5.0 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-focal Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-2.8 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-4.0 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-zesty Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-8.2 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-6.2 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-3.0 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-groovy Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-2.5 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-7.2 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-5.2 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-artful Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-hirsute Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-4.2 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-2.10 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-2.7 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-lunar Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-eoan Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-kinetic Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-8.1 Mai 07 19:25:36 devstack nova-compute[73559]: s390-ccw-virtio-yakkety Mai 07 
Mai 07 19:25:36 devstack nova-compute[73559]: [libvirt host capabilities XML, continued; XML markup was stripped in this capture, so only element values survive — grouped per guest block, element names per the libvirt capabilities schema]
Mai 07 19:25:36 devstack nova-compute[73559]: machines (s390x, continued): s390-ccw-virtio-6.1, s390-ccw-virtio-cosmic, s390-ccw-virtio-bionic, s390-ccw-virtio-2.12, s390-ccw-virtio-7.1, s390-ccw-virtio-2.4, s390-ccw-virtio-5.1, s390-ccw-virtio-2.9, s390-ccw-virtio-impish, s390-ccw-virtio-4.1, s390-ccw-virtio-jammy, s390-ccw-virtio-3.1, s390-ccw-virtio-2.6
Mai 07 19:25:36 devstack nova-compute[73559]: guest: os_type hvm, wordsize 32, emulator /usr/bin/qemu-system-sh4; machines: shix, r2d
Mai 07 19:25:36 devstack nova-compute[73559]: guest: os_type hvm, wordsize 64, emulator /usr/bin/qemu-system-sh4eb; machines: shix, r2d
Mai 07 19:25:36 devstack nova-compute[73559]: guest: os_type hvm, wordsize 32, emulator /usr/bin/qemu-system-sparc; machines: SS-5, SS-20, LX, SPARCClassic, leon3_generic, SPARCbook, SS-4, SS-600MP, SS-10, Voyager
Mai 07 19:25:36 devstack nova-compute[73559]: guest: os_type hvm, wordsize 64, emulator /usr/bin/qemu-system-sparc64; machines: sun4u, niagara, sun4v
Mai 07 19:25:36 devstack nova-compute[73559]: guest: os_type hvm, wordsize 64, emulator /usr/bin/qemu-system-x86_64; machines: pc-i440fx-noble-v2, ubuntu, pc-q35-mantic, pc-i440fx-impish-hpb, pc-q35-5.2, pc-q35-lunar-hpb, pc-i440fx-mantic, pc-i440fx-2.12, pc-i440fx-2.0, pc-i440fx-xenial, pc-q35-kinetic, pc-i440fx-6.2, pc-q35-4.2, pc-i440fx-mantic-maxcpus, pc-i440fx-2.5, pc-i440fx-4.2, pc-i440fx-focal, pc-i440fx-hirsute, pc-q35-xenial, pc-i440fx-jammy-hpb, pc-i440fx-5.2, pc-q35-2.7, pc-q35-eoan-hpb, pc-i440fx-zesty, pc-q35-groovy, pc-i440fx-disco-hpb, pc-q35-lunar, pc-q35-mantic-hpb-maxcpus, pc-i440fx-groovy, pc-q35-7.1, pc-q35-artful, pc-i440fx-trusty, pc-i440fx-2.2, pc-q35-focal-hpb, pc-q35-8.1, pc-i440fx-eoan-hpb, pc-q35-jammy-maxcpus, pc-q35-bionic-hpb, pc-q35-mantic-hpb, pc-i440fx-artful, pc-i440fx-8.1, pc-i440fx-2.7, pc-q35-6.1, pc-i440fx-kinetic, pc-i440fx-jammy-maxcpus, pc-i440fx-yakkety, pc-q35-2.4, pc-q35-cosmic-hpb, pc-i440fx-7.1, pc-q35-2.10, x-remote, pc-q35-5.1, pc-q35-2.9, pc-i440fx-2.11, pc-i440fx-jammy-hpb-maxcpus, pc-q35-3.1, pc-i440fx-6.1, pc-q35-4.1, pc-q35-jammy, pc-i440fx-2.4, pc-i440fx-4.1, pc-q35-eoan, pc-q35-jammy-hpb, pc-i440fx-5.1, pc-i440fx-2.9, pc-i440fx-bionic-hpb, pc-i440fx-lunar, isapc, pc-i440fx-mantic-hpb, pc-q35-cosmic, pc-q35-2.6, pc-q35-mantic-maxcpus, pc-i440fx-3.1, pc-q35-bionic, pc-q35-disco-hpb, pc-i440fx-cosmic, pc-q35-2.12, pc-i440fx-bionic, pc-q35-kinetic-hpb, pc-q35-groovy-hpb, pc-q35-7.0, pc-q35-noble-v2, ubuntu-q35, pc-i440fx-lunar-hpb, pc-q35-disco, pc-i440fx-cosmic-hpb, pc-i440fx-2.1, pc-q35-noble, q35, pc-q35-8.0, pc-i440fx-wily, pc-q35-impish, pc-i440fx-8.0, pc-i440fx-2.6, pc-q35-6.0, pc-i440fx-impish, pc-i440fx-jammy, pc-q35-impish-hpb, pc-q35-hirsute, pc-q35-4.0.1, pc-q35-hirsute-hpb, pc-i440fx-7.0, pc-q35-5.0, pc-q35-2.8, pc-i440fx-2.10, pc-q35-3.0, pc-q35-zesty, pc-q35-7.2, pc-q35-4.0, pc-i440fx-6.0, pc-q35-focal, microvm, pc-i440fx-2.3, pc-q35-jammy-hpb-maxcpus, pc-i440fx-disco, pc-i440fx-kinetic-hpb, pc-i440fx-4.0, pc-i440fx-focal-hpb, pc-i440fx-noble, pc, pc-q35-8.2, q35, pc-i440fx-groovy-hpb, pc-i440fx-hirsute-hpb, pc-i440fx-8.2, pc, pc-i440fx-5.0, pc-i440fx-2.8, pc-q35-6.2, pc-i440fx-eoan, pc-q35-2.5, pc-i440fx-3.0, pc-q35-yakkety, pc-i440fx-mantic-hpb-maxcpus, pc-i440fx-7.2, pc-q35-2.11
Mai 07 19:25:36 devstack nova-compute[73559]: guest: os_type hvm, wordsize 32, emulator /usr/bin/qemu-system-xtensa; machines: sim, kc705, ml605, ml605-nommu, virt, lx60-nommu, lx200, lx200-nommu, lx60, kc705-nommu
Mai 07 19:25:36 devstack nova-compute[73559]: guest: os_type hvm, wordsize 32, emulator /usr/bin/qemu-system-xtensaeb; machines: sim, kc705, ml605, ml605-nommu, virt, lx60-nommu, lx200, lx200-nommu, lx60, kc705-nommu
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for armv6l via machine types: {None, 'virt'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for i686 via machine types: {'q35', 'pc', 'ubuntu-q35', 'ubuntu'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Mai 07 19:25:36 devstack nova-compute[73559]: [domainCapabilities XML for i686/q35, markup stripped; element names per the libvirt domainCapabilities schema] path /usr/bin/qemu-system-i386; domain kvm; machine pc-q35-noble; arch i686; os loader values: /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd; loader type: rom, pflash; readonly: yes, no; secure: no; cpu host-passthrough migratable: off; maximum migratable: on, off; host-model: EPYC-IBPB, vendor AMD
Mai 07 19:25:36 devstack nova-compute[73559]: [domainCapabilities XML for i686/q35, continued; markup stripped] cpu custom models: qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SapphireRapids, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC-Genoa, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486
Mai 07 19:25:36 devstack nova-compute[73559]: memory backing source types: file, anonymous, memfd
Mai 07 19:25:36 devstack nova-compute[73559]: disk: diskDevice disk, cdrom, floppy, lun; bus fdc, scsi, virtio, usb, sata; model virtio, virtio-transitional, virtio-non-transitional
Mai 07 19:25:36 devstack nova-compute[73559]: graphics types: sdl, vnc, spice, egl-headless, dbus
Mai 07 19:25:36 devstack nova-compute[73559]: hostdev: mode subsystem; startupPolicy default, mandatory, requisite, optional; subsystem types usb, pci, scsi
Mai 07 19:25:36 devstack nova-compute[73559]: model values: virtio, virtio-transitional, virtio-non-transitional
Mai 07 19:25:36 devstack nova-compute[73559]: rng backends: random, egd, builtin
Mai 07 19:25:36 devstack nova-compute[73559]: filesystem driver types: path, handle, virtiofs
Mai 07 19:25:36 devstack nova-compute[73559]: tpm: models tpm-tis, tpm-crb; backends passthrough, emulator, external; backend versions 1.2, 2.0
Mai 07 19:25:36 devstack nova-compute[73559]: redirdev bus: usb
Mai 07 19:25:36 devstack nova-compute[73559]: channel types: pty, unix, spicevmc
Mai 07 19:25:36 devstack nova-compute[73559]: crypto: model virtio; type qemu; backend builtin
Mai 07 19:25:36 devstack nova-compute[73559]: Hyper-V features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic
Mai 07 19:25:36 devstack nova-compute[73559]: {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}}
Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Mai 07 19:25:36 devstack nova-compute[73559]: [domainCapabilities XML for i686/pc, markup stripped] path /usr/bin/qemu-system-i386; domain kvm; machine pc-i440fx-noble; arch i686; os loader values: /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd; loader type: rom, pflash; readonly: yes, no; secure: no; cpu host-passthrough migratable: off; maximum migratable: on, off; host-model: EPYC-IBPB, vendor AMD
Mai 07 19:25:36 devstack nova-compute[73559]: cpu custom models: identical to the list reported for machine_type=q35 above (qemu64 through 486)
Mai 07 19:25:36 devstack nova-compute[73559]: memory backing source types: file
Mai 07 19:25:36 devstack
nova-compute[73559]: anonymous Mai 07 19:25:36 devstack nova-compute[73559]: memfd Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: disk Mai 07 19:25:36 devstack nova-compute[73559]: cdrom Mai 07 19:25:36 devstack nova-compute[73559]: floppy Mai 07 19:25:36 devstack nova-compute[73559]: lun Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: ide Mai 07 19:25:36 devstack nova-compute[73559]: fdc Mai 07 19:25:36 devstack nova-compute[73559]: scsi Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: usb Mai 07 19:25:36 devstack nova-compute[73559]: sata Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:36 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: sdl Mai 07 19:25:36 devstack nova-compute[73559]: vnc Mai 07 19:25:36 devstack nova-compute[73559]: spice Mai 07 19:25:36 devstack nova-compute[73559]: egl-headless Mai 07 19:25:36 devstack nova-compute[73559]: dbus Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: subsystem Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack 
nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: default Mai 07 19:25:36 devstack nova-compute[73559]: mandatory Mai 07 19:25:36 devstack nova-compute[73559]: requisite Mai 07 19:25:36 devstack nova-compute[73559]: optional Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: usb Mai 07 19:25:36 devstack nova-compute[73559]: pci Mai 07 19:25:36 devstack nova-compute[73559]: scsi Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:36 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: random Mai 07 19:25:36 devstack nova-compute[73559]: egd Mai 07 19:25:36 devstack nova-compute[73559]: builtin Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: path Mai 07 19:25:36 devstack nova-compute[73559]: handle Mai 07 19:25:36 devstack nova-compute[73559]: virtiofs Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: tpm-tis Mai 07 19:25:36 devstack nova-compute[73559]: tpm-crb Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: passthrough Mai 07 19:25:36 
devstack nova-compute[73559]: emulator Mai 07 19:25:36 devstack nova-compute[73559]: external Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 1.2 Mai 07 19:25:36 devstack nova-compute[73559]: 2.0 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: usb Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: pty Mai 07 19:25:36 devstack nova-compute[73559]: unix Mai 07 19:25:36 devstack nova-compute[73559]: spicevmc Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: qemu Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: builtin Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 
devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: relaxed Mai 07 19:25:36 devstack nova-compute[73559]: vapic Mai 07 19:25:36 devstack nova-compute[73559]: spinlocks Mai 07 19:25:36 devstack nova-compute[73559]: vpindex Mai 07 19:25:36 devstack nova-compute[73559]: runtime Mai 07 19:25:36 devstack nova-compute[73559]: synic Mai 07 19:25:36 devstack nova-compute[73559]: stimer Mai 07 19:25:36 devstack nova-compute[73559]: reset Mai 07 19:25:36 devstack nova-compute[73559]: vendor_id Mai 07 19:25:36 devstack nova-compute[73559]: frequencies Mai 07 19:25:36 devstack nova-compute[73559]: reenlightenment Mai 07 19:25:36 devstack nova-compute[73559]: tlbflush Mai 07 19:25:36 devstack nova-compute[73559]: ipi Mai 07 19:25:36 devstack nova-compute[73559]: avic Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-i386 Mai 07 19:25:36 devstack nova-compute[73559]: kvm Mai 07 19:25:36 devstack nova-compute[73559]: pc-q35-noble-v2 Mai 07 19:25:36 devstack nova-compute[73559]: i686 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: /usr/share/AAVMF/AAVMF_CODE.fd Mai 07 19:25:36 devstack nova-compute[73559]: /usr/share/AAVMF/AAVMF32_CODE.fd Mai 07 
19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: rom Mai 07 19:25:36 devstack nova-compute[73559]: pflash Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: yes Mai 07 19:25:36 devstack nova-compute[73559]: no Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: no Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: off Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: on Mai 07 19:25:36 devstack nova-compute[73559]: off Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: EPYC-IBPB Mai 07 19:25:36 devstack nova-compute[73559]: AMD Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack 
nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: qemu64 Mai 07 19:25:36 devstack nova-compute[73559]: qemu32 Mai 07 19:25:36 devstack nova-compute[73559]: phenom Mai 07 19:25:36 devstack nova-compute[73559]: pentium3 Mai 07 19:25:36 devstack nova-compute[73559]: pentium2 Mai 07 19:25:36 devstack nova-compute[73559]: pentium Mai 07 19:25:36 devstack nova-compute[73559]: n270 Mai 07 19:25:36 devstack nova-compute[73559]: kvm64 Mai 07 19:25:36 devstack nova-compute[73559]: kvm32 Mai 07 19:25:36 devstack nova-compute[73559]: coreduo Mai 07 19:25:36 devstack nova-compute[73559]: core2duo Mai 07 19:25:36 devstack nova-compute[73559]: athlon Mai 07 19:25:36 devstack nova-compute[73559]: Westmere-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Westmere Mai 07 19:25:36 devstack nova-compute[73559]: Snowridge Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Server-noTSX-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Server-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Server Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Client-noTSX-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Client-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Client Mai 07 19:25:36 devstack nova-compute[73559]: SapphireRapids Mai 07 19:25:36 devstack nova-compute[73559]: SandyBridge-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: 
SandyBridge Mai 07 19:25:36 devstack nova-compute[73559]: Penryn Mai 07 19:25:36 devstack nova-compute[73559]: Opteron_G5 Mai 07 19:25:36 devstack nova-compute[73559]: Opteron_G4 Mai 07 19:25:36 devstack nova-compute[73559]: Opteron_G3 Mai 07 19:25:36 devstack nova-compute[73559]: Opteron_G2 Mai 07 19:25:36 devstack nova-compute[73559]: Opteron_G1 Mai 07 19:25:36 devstack nova-compute[73559]: Nehalem-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Nehalem Mai 07 19:25:36 devstack nova-compute[73559]: IvyBridge-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: IvyBridge Mai 07 19:25:36 devstack nova-compute[73559]: Icelake-Server-noTSX Mai 07 19:25:36 devstack nova-compute[73559]: Icelake-Server Mai 07 19:25:36 devstack nova-compute[73559]: Haswell-noTSX-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Haswell-noTSX Mai 07 19:25:36 devstack nova-compute[73559]: Haswell-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Haswell Mai 07 19:25:36 devstack nova-compute[73559]: EPYC-Rome Mai 07 19:25:36 devstack nova-compute[73559]: EPYC-Milan Mai 07 19:25:36 devstack nova-compute[73559]: EPYC-IBPB Mai 07 19:25:36 devstack nova-compute[73559]: EPYC-Genoa Mai 07 19:25:36 devstack nova-compute[73559]: EPYC Mai 07 19:25:36 devstack nova-compute[73559]: Dhyana Mai 07 19:25:36 devstack nova-compute[73559]: Cooperlake Mai 07 19:25:36 devstack nova-compute[73559]: Conroe Mai 07 19:25:36 devstack nova-compute[73559]: Cascadelake-Server-noTSX Mai 07 19:25:36 devstack nova-compute[73559]: Cascadelake-Server Mai 07 19:25:36 devstack nova-compute[73559]: Broadwell-noTSX-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Broadwell-noTSX Mai 07 19:25:36 devstack nova-compute[73559]: Broadwell-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Broadwell Mai 07 19:25:36 devstack nova-compute[73559]: 486 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack 
nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: file Mai 07 19:25:36 devstack nova-compute[73559]: anonymous Mai 07 19:25:36 devstack nova-compute[73559]: memfd Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: disk Mai 07 19:25:36 devstack nova-compute[73559]: cdrom Mai 07 19:25:36 devstack nova-compute[73559]: floppy Mai 07 19:25:36 devstack nova-compute[73559]: lun Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: fdc Mai 07 19:25:36 devstack nova-compute[73559]: scsi Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: usb Mai 07 19:25:36 devstack nova-compute[73559]: sata Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:36 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: sdl Mai 07 19:25:36 devstack nova-compute[73559]: vnc Mai 07 19:25:36 devstack nova-compute[73559]: spice Mai 07 19:25:36 devstack nova-compute[73559]: egl-headless Mai 07 19:25:36 devstack nova-compute[73559]: dbus Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: subsystem Mai 07 19:25:36 
devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: default Mai 07 19:25:36 devstack nova-compute[73559]: mandatory Mai 07 19:25:36 devstack nova-compute[73559]: requisite Mai 07 19:25:36 devstack nova-compute[73559]: optional Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: usb Mai 07 19:25:36 devstack nova-compute[73559]: pci Mai 07 19:25:36 devstack nova-compute[73559]: scsi Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:36 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: random Mai 07 19:25:36 devstack nova-compute[73559]: egd Mai 07 19:25:36 devstack nova-compute[73559]: builtin Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: path Mai 07 19:25:36 devstack nova-compute[73559]: handle Mai 07 19:25:36 devstack nova-compute[73559]: virtiofs Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: tpm-tis Mai 07 19:25:36 devstack nova-compute[73559]: tpm-crb Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 
devstack nova-compute[73559]: passthrough Mai 07 19:25:36 devstack nova-compute[73559]: emulator Mai 07 19:25:36 devstack nova-compute[73559]: external Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: 1.2 Mai 07 19:25:36 devstack nova-compute[73559]: 2.0 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: usb Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: pty Mai 07 19:25:36 devstack nova-compute[73559]: unix Mai 07 19:25:36 devstack nova-compute[73559]: spicevmc Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: virtio Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: qemu Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: builtin Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 
07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: relaxed Mai 07 19:25:36 devstack nova-compute[73559]: vapic Mai 07 19:25:36 devstack nova-compute[73559]: spinlocks Mai 07 19:25:36 devstack nova-compute[73559]: vpindex Mai 07 19:25:36 devstack nova-compute[73559]: runtime Mai 07 19:25:36 devstack nova-compute[73559]: synic Mai 07 19:25:36 devstack nova-compute[73559]: stimer Mai 07 19:25:36 devstack nova-compute[73559]: reset Mai 07 19:25:36 devstack nova-compute[73559]: vendor_id Mai 07 19:25:36 devstack nova-compute[73559]: frequencies Mai 07 19:25:36 devstack nova-compute[73559]: reenlightenment Mai 07 19:25:36 devstack nova-compute[73559]: tlbflush Mai 07 19:25:36 devstack nova-compute[73559]: ipi Mai 07 19:25:36 devstack nova-compute[73559]: avic Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: /usr/bin/qemu-system-i386 Mai 07 19:25:36 devstack nova-compute[73559]: kvm Mai 07 19:25:36 devstack nova-compute[73559]: pc-i440fx-noble-v2 Mai 07 19:25:36 devstack nova-compute[73559]: i686 Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: /usr/share/AAVMF/AAVMF_CODE.fd Mai 07 19:25:36 devstack 
nova-compute[73559]: /usr/share/AAVMF/AAVMF32_CODE.fd Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: rom Mai 07 19:25:36 devstack nova-compute[73559]: pflash Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: yes Mai 07 19:25:36 devstack nova-compute[73559]: no Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: no Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: off Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: on Mai 07 19:25:36 devstack nova-compute[73559]: off Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: EPYC-IBPB Mai 07 19:25:36 devstack nova-compute[73559]: AMD Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 
19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: Mai 07 19:25:36 devstack nova-compute[73559]: qemu64 Mai 07 19:25:36 devstack nova-compute[73559]: qemu32 Mai 07 19:25:36 devstack nova-compute[73559]: phenom Mai 07 19:25:36 devstack nova-compute[73559]: pentium3 Mai 07 19:25:36 devstack nova-compute[73559]: pentium2 Mai 07 19:25:36 devstack nova-compute[73559]: pentium Mai 07 19:25:36 devstack nova-compute[73559]: n270 Mai 07 19:25:36 devstack nova-compute[73559]: kvm64 Mai 07 19:25:36 devstack nova-compute[73559]: kvm32 Mai 07 19:25:36 devstack nova-compute[73559]: coreduo Mai 07 19:25:36 devstack nova-compute[73559]: core2duo Mai 07 19:25:36 devstack nova-compute[73559]: athlon Mai 07 19:25:36 devstack nova-compute[73559]: Westmere-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Westmere Mai 07 19:25:36 devstack nova-compute[73559]: Snowridge Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Server-noTSX-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Server-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Server Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Client-noTSX-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Client-IBRS Mai 07 19:25:36 devstack nova-compute[73559]: Skylake-Client Mai 07 19:25:36 devstack nova-compute[73559]: SapphireRapids Mai 07 19:25:36 devstack nova-compute[73559]: 
Mai 07 19:25:36 devstack nova-compute[73559]: [domain capabilities XML output elided: supported guest CPU models (SandyBridge-IBRS … 486), memory backing (file, anonymous, memfd), disk device types (disk, cdrom, floppy, lun) and buses (ide, fdc, scsi, virtio, usb, sata), virtio model variants, graphics types (sdl, vnc, spice, egl-headless, dbus), hostdev (usb, pci, scsi), rng backends (random, egd, builtin), filesystem drivers (path, handle, virtiofs), TPM models (tpm-tis, tpm-crb), backends (passthrough, emulator, external) and versions (1.2, 2.0), console/channel types (pty, unix, spicevmc), and Hyper-V enlightenment features (relaxed … avic)] {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for m68k via machine types: {None, 'virt'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36
devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None 
req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mips64' on this host 
{{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for ppc64 via machine types: {None, 'pseries', 'powernv'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when 
retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host 
[None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-riscv64' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for 
s390x via machine types: {'s390-ccw-virtio'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=73559) _add_to_domain_capabilities 
/opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sparc64' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc', 'ubuntu-q35', 'ubuntu'} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=x86_64 and 
machine_type=q35: Mai 07 19:25:36 devstack nova-compute[73559]: [domain capabilities XML output elided: emulator /usr/bin/qemu-system-x86_64, domain type kvm, machine pc-q35-noble, arch x86_64, os loader type efi with firmware images /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd, /usr/share/ovmf/OVMF.amdsev.fd, /usr/share/OVMF/OVMF_CODE_4M.fd (rom, pflash), host CPU model EPYC-IBPB (vendor AMD), supported guest CPU models (qemu64 … 486), memory backing (file, anonymous, memfd), disk device types and buses, virtio model variants, graphics types (sdl, vnc, spice, egl-headless, dbus), hostdev (usb, pci, scsi), rng backends (random, egd, builtin), filesystem drivers (path, handle, virtiofs), TPM models/backends/versions, console/channel types (pty, unix, spicevmc), and Hyper-V enlightenment features (relaxed … avic)] {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:25:36 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Mai 07 19:25:36 devstack nova-compute[73559]: [domain capabilities XML output elided: emulator /usr/bin/qemu-system-x86_64, domain type kvm, machine pc-i440fx-noble, arch x86_64, os loader type efi with firmware image /usr/share/OVMF/OVMF_CODE_4M.fd (rom, pflash), host CPU model EPYC-IBPB (vendor AMD); remainder truncated]
Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: qemu64 Mai 07 19:25:37 devstack nova-compute[73559]: qemu32 Mai 07 19:25:37 devstack nova-compute[73559]: phenom Mai 07 19:25:37 devstack nova-compute[73559]: pentium3 Mai 07 19:25:37 devstack nova-compute[73559]: pentium2 Mai 07 19:25:37 devstack nova-compute[73559]: pentium Mai 07 19:25:37 devstack nova-compute[73559]: n270 Mai 07 19:25:37 devstack nova-compute[73559]: kvm64 Mai 07 19:25:37 devstack nova-compute[73559]: kvm32 Mai 07 19:25:37 devstack nova-compute[73559]: coreduo Mai 07 19:25:37 devstack nova-compute[73559]: core2duo Mai 07 19:25:37 devstack nova-compute[73559]: athlon Mai 07 19:25:37 devstack nova-compute[73559]: Westmere-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Westmere Mai 07 19:25:37 devstack nova-compute[73559]: Snowridge Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Server-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Server-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Server Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Client-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Client-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Client Mai 07 19:25:37 devstack nova-compute[73559]: SapphireRapids Mai 07 19:25:37 devstack nova-compute[73559]: SandyBridge-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: SandyBridge Mai 07 19:25:37 devstack nova-compute[73559]: Penryn Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G5 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G4 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G3 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G2 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G1 Mai 07 19:25:37 devstack nova-compute[73559]: Nehalem-IBRS Mai 07 19:25:37 devstack 
nova-compute[73559]: Nehalem Mai 07 19:25:37 devstack nova-compute[73559]: IvyBridge-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: IvyBridge Mai 07 19:25:37 devstack nova-compute[73559]: Icelake-Server-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Icelake-Server Mai 07 19:25:37 devstack nova-compute[73559]: Haswell-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Haswell-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Haswell-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Haswell Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-Rome Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-Milan Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-IBPB Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-Genoa Mai 07 19:25:37 devstack nova-compute[73559]: EPYC Mai 07 19:25:37 devstack nova-compute[73559]: Dhyana Mai 07 19:25:37 devstack nova-compute[73559]: Cooperlake Mai 07 19:25:37 devstack nova-compute[73559]: Conroe Mai 07 19:25:37 devstack nova-compute[73559]: Cascadelake-Server-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Cascadelake-Server Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell Mai 07 19:25:37 devstack nova-compute[73559]: 486 Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: file Mai 07 19:25:37 devstack nova-compute[73559]: anonymous Mai 07 19:25:37 devstack nova-compute[73559]: memfd Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack 
nova-compute[73559]: disk Mai 07 19:25:37 devstack nova-compute[73559]: cdrom Mai 07 19:25:37 devstack nova-compute[73559]: floppy Mai 07 19:25:37 devstack nova-compute[73559]: lun Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: ide Mai 07 19:25:37 devstack nova-compute[73559]: fdc Mai 07 19:25:37 devstack nova-compute[73559]: scsi Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: usb Mai 07 19:25:37 devstack nova-compute[73559]: sata Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:37 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: sdl Mai 07 19:25:37 devstack nova-compute[73559]: vnc Mai 07 19:25:37 devstack nova-compute[73559]: spice Mai 07 19:25:37 devstack nova-compute[73559]: egl-headless Mai 07 19:25:37 devstack nova-compute[73559]: dbus Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: subsystem Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: default Mai 07 19:25:37 devstack nova-compute[73559]: mandatory Mai 07 19:25:37 devstack nova-compute[73559]: requisite Mai 07 19:25:37 devstack nova-compute[73559]: optional Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 
07 19:25:37 devstack nova-compute[73559]: usb Mai 07 19:25:37 devstack nova-compute[73559]: pci Mai 07 19:25:37 devstack nova-compute[73559]: scsi Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:37 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: random Mai 07 19:25:37 devstack nova-compute[73559]: egd Mai 07 19:25:37 devstack nova-compute[73559]: builtin Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: path Mai 07 19:25:37 devstack nova-compute[73559]: handle Mai 07 19:25:37 devstack nova-compute[73559]: virtiofs Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: tpm-tis Mai 07 19:25:37 devstack nova-compute[73559]: tpm-crb Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: passthrough Mai 07 19:25:37 devstack nova-compute[73559]: emulator Mai 07 19:25:37 devstack nova-compute[73559]: external Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: 1.2 Mai 07 19:25:37 devstack nova-compute[73559]: 2.0 Mai 07 19:25:37 devstack nova-compute[73559]: Mai 
07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: usb Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: pty Mai 07 19:25:37 devstack nova-compute[73559]: unix Mai 07 19:25:37 devstack nova-compute[73559]: spicevmc Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: qemu Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: builtin Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: relaxed Mai 07 19:25:37 devstack nova-compute[73559]: vapic Mai 07 19:25:37 devstack nova-compute[73559]: spinlocks Mai 07 19:25:37 devstack nova-compute[73559]: vpindex Mai 07 19:25:37 devstack nova-compute[73559]: runtime Mai 07 19:25:37 devstack 
nova-compute[73559]: synic Mai 07 19:25:37 devstack nova-compute[73559]: stimer Mai 07 19:25:37 devstack nova-compute[73559]: reset Mai 07 19:25:37 devstack nova-compute[73559]: vendor_id Mai 07 19:25:37 devstack nova-compute[73559]: frequencies Mai 07 19:25:37 devstack nova-compute[73559]: reenlightenment Mai 07 19:25:37 devstack nova-compute[73559]: tlbflush Mai 07 19:25:37 devstack nova-compute[73559]: ipi Mai 07 19:25:37 devstack nova-compute[73559]: avic Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:25:37 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: /usr/bin/qemu-system-x86_64 Mai 07 19:25:37 devstack nova-compute[73559]: kvm Mai 07 19:25:37 devstack nova-compute[73559]: pc-q35-noble-v2 Mai 07 19:25:37 devstack nova-compute[73559]: x86_64 Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: efi Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: /usr/share/OVMF/OVMF_CODE_4M.ms.fd Mai 07 19:25:37 devstack nova-compute[73559]: /usr/share/OVMF/OVMF_CODE_4M.secboot.fd Mai 07 19:25:37 devstack nova-compute[73559]: /usr/share/ovmf/OVMF.amdsev.fd Mai 07 19:25:37 devstack nova-compute[73559]: /usr/share/OVMF/OVMF_CODE_4M.fd Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 
devstack nova-compute[73559]: rom Mai 07 19:25:37 devstack nova-compute[73559]: pflash Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: yes Mai 07 19:25:37 devstack nova-compute[73559]: no Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: yes Mai 07 19:25:37 devstack nova-compute[73559]: no Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: on Mai 07 19:25:37 devstack nova-compute[73559]: off Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: on Mai 07 19:25:37 devstack nova-compute[73559]: off Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-IBPB Mai 07 19:25:37 devstack nova-compute[73559]: AMD Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack 
nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: qemu64 Mai 07 19:25:37 devstack nova-compute[73559]: qemu32 Mai 07 19:25:37 devstack nova-compute[73559]: phenom Mai 07 19:25:37 devstack nova-compute[73559]: pentium3 Mai 07 19:25:37 devstack nova-compute[73559]: pentium2 Mai 07 19:25:37 devstack nova-compute[73559]: pentium Mai 07 19:25:37 devstack nova-compute[73559]: n270 Mai 07 19:25:37 devstack nova-compute[73559]: kvm64 Mai 07 19:25:37 devstack nova-compute[73559]: kvm32 Mai 07 19:25:37 devstack nova-compute[73559]: coreduo Mai 07 19:25:37 devstack nova-compute[73559]: core2duo Mai 07 19:25:37 devstack nova-compute[73559]: athlon Mai 07 19:25:37 devstack nova-compute[73559]: Westmere-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Westmere Mai 07 19:25:37 devstack nova-compute[73559]: Snowridge Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Server-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Server-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Server Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Client-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Client-IBRS 
Mai 07 19:25:37 devstack nova-compute[73559]: Skylake-Client Mai 07 19:25:37 devstack nova-compute[73559]: SapphireRapids Mai 07 19:25:37 devstack nova-compute[73559]: SandyBridge-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: SandyBridge Mai 07 19:25:37 devstack nova-compute[73559]: Penryn Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G5 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G4 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G3 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G2 Mai 07 19:25:37 devstack nova-compute[73559]: Opteron_G1 Mai 07 19:25:37 devstack nova-compute[73559]: Nehalem-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Nehalem Mai 07 19:25:37 devstack nova-compute[73559]: IvyBridge-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: IvyBridge Mai 07 19:25:37 devstack nova-compute[73559]: Icelake-Server-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Icelake-Server Mai 07 19:25:37 devstack nova-compute[73559]: Haswell-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Haswell-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Haswell-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Haswell Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-Rome Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-Milan Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-IBPB Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-Genoa Mai 07 19:25:37 devstack nova-compute[73559]: EPYC Mai 07 19:25:37 devstack nova-compute[73559]: Dhyana Mai 07 19:25:37 devstack nova-compute[73559]: Cooperlake Mai 07 19:25:37 devstack nova-compute[73559]: Conroe Mai 07 19:25:37 devstack nova-compute[73559]: Cascadelake-Server-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Cascadelake-Server Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell-noTSX-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell-noTSX Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell-IBRS Mai 07 19:25:37 devstack nova-compute[73559]: Broadwell 
Mai 07 19:25:37 devstack nova-compute[73559]: 486 Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: file Mai 07 19:25:37 devstack nova-compute[73559]: anonymous Mai 07 19:25:37 devstack nova-compute[73559]: memfd Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: disk Mai 07 19:25:37 devstack nova-compute[73559]: cdrom Mai 07 19:25:37 devstack nova-compute[73559]: floppy Mai 07 19:25:37 devstack nova-compute[73559]: lun Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: fdc Mai 07 19:25:37 devstack nova-compute[73559]: scsi Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: usb Mai 07 19:25:37 devstack nova-compute[73559]: sata Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:37 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: sdl Mai 07 19:25:37 devstack nova-compute[73559]: vnc Mai 07 19:25:37 devstack nova-compute[73559]: spice Mai 07 19:25:37 devstack nova-compute[73559]: egl-headless Mai 07 19:25:37 devstack nova-compute[73559]: dbus Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 
07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: subsystem Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: default Mai 07 19:25:37 devstack nova-compute[73559]: mandatory Mai 07 19:25:37 devstack nova-compute[73559]: requisite Mai 07 19:25:37 devstack nova-compute[73559]: optional Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: usb Mai 07 19:25:37 devstack nova-compute[73559]: pci Mai 07 19:25:37 devstack nova-compute[73559]: scsi Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: virtio-transitional Mai 07 19:25:37 devstack nova-compute[73559]: virtio-non-transitional Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: random Mai 07 19:25:37 devstack nova-compute[73559]: egd Mai 07 19:25:37 devstack nova-compute[73559]: builtin Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: path Mai 07 19:25:37 devstack nova-compute[73559]: handle Mai 07 19:25:37 devstack nova-compute[73559]: virtiofs Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 
19:25:37 devstack nova-compute[73559]: tpm-tis Mai 07 19:25:37 devstack nova-compute[73559]: tpm-crb Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: passthrough Mai 07 19:25:37 devstack nova-compute[73559]: emulator Mai 07 19:25:37 devstack nova-compute[73559]: external Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: 1.2 Mai 07 19:25:37 devstack nova-compute[73559]: 2.0 Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: usb Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: pty Mai 07 19:25:37 devstack nova-compute[73559]: unix Mai 07 19:25:37 devstack nova-compute[73559]: spicevmc Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: virtio Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: qemu Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: builtin Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack 
nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: relaxed Mai 07 19:25:37 devstack nova-compute[73559]: vapic Mai 07 19:25:37 devstack nova-compute[73559]: spinlocks Mai 07 19:25:37 devstack nova-compute[73559]: vpindex Mai 07 19:25:37 devstack nova-compute[73559]: runtime Mai 07 19:25:37 devstack nova-compute[73559]: synic Mai 07 19:25:37 devstack nova-compute[73559]: stimer Mai 07 19:25:37 devstack nova-compute[73559]: reset Mai 07 19:25:37 devstack nova-compute[73559]: vendor_id Mai 07 19:25:37 devstack nova-compute[73559]: frequencies Mai 07 19:25:37 devstack nova-compute[73559]: reenlightenment Mai 07 19:25:37 devstack nova-compute[73559]: tlbflush Mai 07 19:25:37 devstack nova-compute[73559]: ipi Mai 07 19:25:37 devstack nova-compute[73559]: avic Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:25:37 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: /usr/bin/qemu-system-x86_64 Mai 07 19:25:37 devstack nova-compute[73559]: kvm Mai 07 19:25:37 devstack nova-compute[73559]: pc-i440fx-noble-v2 Mai 07 19:25:37 devstack nova-compute[73559]: x86_64 Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack 
nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: efi Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: /usr/share/OVMF/OVMF_CODE_4M.fd Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: rom Mai 07 19:25:37 devstack nova-compute[73559]: pflash Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: yes Mai 07 19:25:37 devstack nova-compute[73559]: no Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: no Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: on Mai 07 19:25:37 devstack nova-compute[73559]: off Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: on Mai 07 19:25:37 devstack nova-compute[73559]: off Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: EPYC-IBPB Mai 07 19:25:37 devstack nova-compute[73559]: AMD Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 19:25:37 devstack nova-compute[73559]: Mai 07 
Mai 07 19:25:37 devstack nova-compute[73559]: (multi-line libvirt domain capabilities payload; XML element names were stripped by the log formatting, so the enum values are grouped below)
Mai 07 19:25:37 devstack nova-compute[73559]:   CPU models: qemu64 qemu32 phenom pentium3 pentium2 pentium n270 kvm64 kvm32 coreduo core2duo athlon Westmere-IBRS Westmere Snowridge Skylake-Server-noTSX-IBRS Skylake-Server-IBRS Skylake-Server Skylake-Client-noTSX-IBRS Skylake-Client-IBRS Skylake-Client SapphireRapids SandyBridge-IBRS SandyBridge Penryn Opteron_G5 Opteron_G4 Opteron_G3 Opteron_G2 Opteron_G1 Nehalem-IBRS Nehalem IvyBridge-IBRS IvyBridge Icelake-Server-noTSX Icelake-Server Haswell-noTSX-IBRS Haswell-noTSX Haswell-IBRS Haswell EPYC-Rome EPYC-Milan EPYC-IBPB EPYC-Genoa EPYC Dhyana Cooperlake Conroe Cascadelake-Server-noTSX Cascadelake-Server Broadwell-noTSX-IBRS Broadwell-noTSX Broadwell-IBRS Broadwell 486
Mai 07 19:25:37 devstack nova-compute[73559]:   memory backing source types: file anonymous memfd
Mai 07 19:25:37 devstack nova-compute[73559]:   disk device types: disk cdrom floppy lun; bus types: ide fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Mai 07 19:25:37 devstack nova-compute[73559]:   graphics types: sdl vnc spice egl-headless dbus
Mai 07 19:25:37 devstack nova-compute[73559]:   hostdev mode: subsystem; startup policies: default mandatory requisite optional; subsystem types: usb pci scsi
Mai 07 19:25:37 devstack nova-compute[73559]:   rng models: virtio virtio-transitional virtio-non-transitional; rng backend models: random egd builtin
Mai 07 19:25:37 devstack nova-compute[73559]:   filesystem driver types: path handle virtiofs
Mai 07 19:25:37 devstack nova-compute[73559]:   tpm models: tpm-tis tpm-crb; tpm backend models: passthrough emulator external; backend versions: 1.2 2.0
Mai 07 19:25:37 devstack nova-compute[73559]:   redirdev bus: usb; channel types: pty unix spicevmc
Mai 07 19:25:37 devstack nova-compute[73559]:   crypto model: virtio; crypto type: qemu; crypto backend model: builtin
Mai 07 19:25:37 devstack nova-compute[73559]:   hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic
Mai 07 19:25:37 devstack nova-compute[73559]: {{(pid=73559) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}}
Mai 07 19:25:37 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=73559) _get_machine_types
/opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:37 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:37 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=73559) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:25:37 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=73559) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:25:37 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Checking secure boot support for host arch (x86_64) {{(pid=73559) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1962}} Mai 07 19:25:37 devstack nova-compute[73559]: INFO nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Secure Boot support detected Mai 07 19:25:37 devstack nova-compute[73559]: INFO nova.virt.node [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Generated node identity cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 Mai 07 19:25:37 devstack nova-compute[73559]: INFO nova.virt.node [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Wrote node identity 
cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 to /opt/stack/data/nova/compute_id Mai 07 19:25:37 devstack nova-compute[73559]: WARNING nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Compute nodes ['cdec9dae-2ed4-4fdf-a972-e5c56ba8b944'] for host devstack were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. Mai 07 19:25:38 devstack nova-compute[73559]: INFO nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Mai 07 19:25:39 devstack nova-compute[73559]: WARNING nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] No compute node record found for host devstack. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host devstack could not be found. Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s 
{{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=73559) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:25:39 devstack nova-compute[73559]: WARNING nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Running cmd (subprocess): env LANG=C uptime {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] CMD "env LANG=C uptime" returned: 0 in 0.047s {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Hypervisor/Node resource view: name=devstack free_ram=6711MB free_disk=15.451988220214844GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": 
"0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=73559) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:25:39 devstack 
nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:25:39 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:25:40 devstack nova-compute[73559]: WARNING nova.compute.resource_tracker [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] No compute node record for devstack:cdec9dae-2ed4-4fdf-a972-e5c56ba8b944: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 could not be found. 
Mai 07 19:25:40 devstack nova-compute[73559]: INFO nova.compute.resource_tracker [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Compute node record created for devstack:devstack with uuid: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 Mai 07 19:25:42 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Total usable vcpus: 4, total allocated vcpus: 0 {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:25:42 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Final resource view: name=devstack phys_ram=11961MB used_ram=512MB phys_disk=25GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:25:39 up 25 min, 1 user, load average: 2.80, 2.55, 1.70\n'} {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:25:43 devstack nova-compute[73559]: INFO nova.scheduler.client.report [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] [req-97b55fee-3f2f-4faa-9e2a-c4163c33560a] Created resource provider record via placement API for resource provider with UUID cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 and name devstack. 
Mai 07 19:25:43 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] /sys/module/kvm_amd/parameters/sev contains [N Mai 07 19:25:43 devstack nova-compute[73559]: ] {{(pid=73559) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:2038}} Mai 07 19:25:43 devstack nova-compute[73559]: INFO nova.virt.libvirt.host [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] kernel doesn't support AMD SEV Mai 07 19:25:43 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=73559) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:25:43 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating inventory in ProviderTree for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 with inventory: {'MEMORY_MB': {'total': 11961, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 25, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=73559) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Mai 07 19:25:43 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=73559) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:25:43 devstack nova-compute[73559]: DEBUG nova.scheduler.client.report [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updated inventory for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 with generation 0 in Placement from 
set_inventory_for_provider using data: {'MEMORY_MB': {'total': 11961, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 25, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=73559) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:975}} Mai 07 19:25:43 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 generation from 0 to 1 during operation: update_inventory {{(pid=73559) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Mai 07 19:25:43 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating inventory in ProviderTree for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 with inventory: {'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=73559) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Mai 07 19:25:44 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 generation from 1 to 2 during operation: update_traits {{(pid=73559) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Mai 07 19:25:44 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Compute_service record 
updated for devstack:devstack {{(pid=73559) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:25:44 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.805s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:25:44 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Creating RPC server for service: nova-compute on topic: compute {{(pid=73559) start /opt/stack/nova/nova/service.py:195}} Mai 07 19:25:44 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Creating 2nd RPC server for service: nova-compute on topic: compute-alt {{(pid=73559) start /opt/stack/nova/nova/service.py:211}} Mai 07 19:25:44 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Join ServiceGroup membership for this service compute {{(pid=73559) start /opt/stack/nova/nova/service.py:221}} Mai 07 19:25:44 devstack nova-compute[73559]: DEBUG nova.servicegroup.drivers.db [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] DB_Driver: join new ServiceGroup member devstack to the compute group, service = {{(pid=73559) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Mai 07 19:25:50 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.49 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:26:09 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] 
Running periodic task ComputeManager._sync_power_states {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:10 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:10 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 8.66 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=73559) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager.update_available_resource {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=73559) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:26:19 devstack nova-compute[73559]: WARNING nova.virt.libvirt.driver [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] This host appears to have multiple sockets per NUMA 
node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running cmd (subprocess): env LANG=C uptime {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CMD "env LANG=C uptime" returned: 0 in 0.022s {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Hypervisor/Node resource view: name=devstack free_ram=6475MB free_disk=15.409767150878906GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=73559) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:26:19 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=73559) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:26:20 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Total usable vcpus: 4, total allocated vcpus: 0 {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:26:20 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Final resource view: name=devstack phys_ram=11961MB used_ram=512MB phys_disk=25GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:26:19 up 26 min, 1 user, load average: 2.56, 2.54, 1.73\n'} {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:26:20 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=73559) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:26:20 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=73559) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:26:21 devstack nova-compute[73559]: DEBUG nova.scheduler.client.report [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=73559) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:26:21 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Compute_service record updated for devstack:devstack {{(pid=73559) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:26:21 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.095s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:26:21 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 59.99 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task 
ComputeManager._poll_rebooting_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=73559) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:27:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager.update_available_resource {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=73559) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:27:22 devstack nova-compute[73559]: WARNING nova.virt.libvirt.driver [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] This host appears to have multiple sockets per NUMA 
node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running cmd (subprocess): env LANG=C uptime {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CMD "env LANG=C uptime" returned: 0 in 0.020s {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Hypervisor/Node resource view: name=devstack free_ram=6436MB free_disk=15.432823181152344GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=73559) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:27:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=73559) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:27:23 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Total usable vcpus: 4, total allocated vcpus: 0 {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:27:23 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Final resource view: name=devstack phys_ram=11961MB used_ram=512MB phys_disk=25GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:27:22 up 27 min, 1 user, load average: 1.80, 2.31, 1.70\n'} {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:27:23 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=73559) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:27:23 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=73559) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:27:23 devstack nova-compute[73559]: DEBUG nova.scheduler.client.report [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=73559) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:27:24 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Compute_service record updated for devstack:devstack {{(pid=73559) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:27:24 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.089s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:27:24 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 60.00 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:27:50 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.50 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task 
ComputeManager._poll_rebooting_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=73559) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager.update_available_resource {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=73559) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:28:24 devstack nova-compute[73559]: WARNING nova.virt.libvirt.driver [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] This host appears to have multiple sockets per NUMA 
node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:28:24 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running cmd (subprocess): env LANG=C uptime {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:28:25 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CMD "env LANG=C uptime" returned: 0 in 0.021s {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:28:25 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Hypervisor/Node resource view: name=devstack free_ram=6470MB free_disk=15.201045989990234GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=73559) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:28:25 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:28:25 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=73559) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:28:26 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Total usable vcpus: 4, total allocated vcpus: 0 {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:28:26 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Final resource view: name=devstack phys_ram=11961MB used_ram=512MB phys_disk=25GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:28:25 up 28 min, 1 user, load average: 2.33, 2.34, 1.76\n'} {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:28:26 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=73559) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:28:26 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=73559) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:28:26 devstack nova-compute[73559]: DEBUG nova.scheduler.client.report [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=73559) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:28:27 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Compute_service record updated for devstack:devstack {{(pid=73559) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:28:27 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.125s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:28:27 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 54.70 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:29:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:21 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager.update_available_resource {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=73559) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:29:22 devstack nova-compute[73559]: WARNING nova.virt.libvirt.driver [None 
req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running cmd (subprocess): env LANG=C uptime {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.processutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CMD "env LANG=C uptime" returned: 0 in 0.020s {{(pid=73559) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Hypervisor/Node resource view: name=devstack free_ram=6459MB free_disk=15.076732635498047GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, 
"label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": 
"0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=73559) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:29:22 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:29:23 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Total usable vcpus: 4, total allocated vcpus: 0 {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:29:23 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Final resource view: name=devstack phys_ram=11961MB used_ram=512MB phys_disk=25GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:29:22 up 29 min, 1 user, load average: 1.42, 2.08, 1.70\n'} {{(pid=73559) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:29:23 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=73559) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:29:23 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=73559) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:29:23 devstack nova-compute[73559]: DEBUG nova.scheduler.client.report [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 
'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=73559) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:29:24 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Compute_service record updated for devstack:devstack {{(pid=73559) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:29:24 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.086s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:29:24 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 1.32 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:29:25 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:25 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:25 devstack nova-compute[73559]: DEBUG 
oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:25 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:25 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:29:25 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=73559) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:29:25 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 53.36 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:29:50 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.50 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:30:14 devstack nova-compute[73559]: INFO nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Connection event '0' reason 'Connection to libvirt lost: 1' Mai 07 19:30:15 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating compute service status to disabled {{(pid=73559) _set_host_enabled /opt/stack/nova/nova/virt/libvirt/driver.py:5735}} Mai 07 19:30:15 devstack nova-compute[73559]: DEBUG nova.objects.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Lazy-loading 'compute_node' on Service id 2 {{(pid=73559) obj_load_attr /opt/stack/nova/nova/objects/service.py:431}} Mai 07 19:30:16 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 in placement. 
{{(pid=73559) update_compute_provider_status /opt/stack/nova/nova/compute/manager.py:634}} Mai 07 19:30:16 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 generation from 2 to 3 during operation: update_traits {{(pid=73559) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Mai 07 19:30:16 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.volume.mount [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Destroying MountManager generation 0 {{(pid=73559) _host_down /opt/stack/nova/nova/virt/libvirt/volume/mount.py:147}} Mai 07 19:30:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:30:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:30:19 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Cleaning up deleted instances {{(pid=73559) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:12083}} Mai 07 19:30:19 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] There are 0 instances to clean {{(pid=73559) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:12092}} Mai 07 19:30:19 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task 
ComputeManager._cleanup_incomplete_migrations {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:30:19 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Cleaning up deleted instances with incomplete migration {{(pid=73559) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:12121}} Mai 07 19:30:20 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:30:20 devstack nova-compute[73559]: DEBUG oslo.service.backend._threading.loopingcall [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 2.00 seconds {{(pid=73559) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:30:22 devstack nova-compute[73559]: DEBUG oslo_service.periodic_task [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Running periodic task ComputeManager.update_available_resource {{(pid=73559) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:30:22 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Connecting to libvirt: qemu:///system {{(pid=73559) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:558}} Mai 07 19:30:22 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Registering for lifecycle events {{(pid=73559) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:564}} Mai 07 19:30:22 devstack 
nova-compute[73559]: DEBUG nova.virt.libvirt.host [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Registering for connection events: {{(pid=73559) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:585}} Mai 07 19:30:22 devstack nova-compute[73559]: INFO nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Connection event '1' reason 'None' Mai 07 19:30:23 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:30:23 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:30:23 devstack nova-compute[73559]: DEBUG oslo_concurrency.lockutils [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=73559) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:30:23 devstack nova-compute[73559]: DEBUG nova.compute.resource_tracker [None req-4b1f4142-580b-4ef7-ac8a-bfa77697b9bb None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=73559) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:30:23 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.driver [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating compute service status to enabled {{(pid=73559) 
_set_host_enabled /opt/stack/nova/nova/virt/libvirt/driver.py:5735}} Mai 07 19:30:23 devstack nova-compute[73559]: DEBUG nova.objects.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Lazy-loading 'compute_node' on Service id 2 {{(pid=73559) obj_load_attr /opt/stack/nova/nova/objects/service.py:431}} Mai 07 19:30:24 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 in placement. {{(pid=73559) update_compute_provider_status /opt/stack/nova/nova/compute/manager.py:630}} Mai 07 19:30:24 devstack nova-compute[73559]: DEBUG nova.compute.provider_tree [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Updating resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 generation from 3 to 4 during operation: update_traits {{(pid=73559) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Mai 07 19:30:24 devstack nova-compute[73559]: DEBUG nova.virt.libvirt.volume.mount [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Initialising _HostMountState generation 1 {{(pid=73559) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Mai 07 19:30:32 devstack nova-compute[73559]: INFO oslo_service.backend._threading.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Graceful shutdown start Mai 07 19:30:32 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute service graceful shutdown started. 
{{(pid=73559) stop /opt/stack/nova/nova/service.py:324}} Mai 07 19:30:32 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute service stopping RPC server on topic: compute {{(pid=73559) _shutdown_rpc_server /opt/stack/nova/nova/service.py:313}} Mai 07 19:30:34 devstack nova-compute[73559]: WARNING amqp [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Received method (60, 30) during closing channel 1. This method will be ignored Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute service stopped RPC server on topic: compute {{(pid=73559) _shutdown_rpc_server /opt/stack/nova/nova/service.py:317}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute manager graceful shutdown started. {{(pid=73559) stop /opt/stack/nova/nova/service.py:332}} Mai 07 19:30:34 devstack nova-compute[73559]: WARNING nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] manager_shutdown_timeout (160) is higher than graceful_shutdown_timeout (5); the service may be killed before the manager finishes waiting. 
Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.compute.manager [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Compute service manager is waiting for 0 seconds to finish in-progress tasks {{(pid=73559) graceful_shutdown /opt/stack/nova/nova/compute/manager.py:1909}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.utils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Sentinel is queued {{(pid=73559) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.utils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Shutdown is set {{(pid=73559) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.utils [-] Received Task(fn=<... at 0x7dfdb5a687c0>, remaining_delay=-313.7288886849999 future=<...>) {{(pid=73559) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.utils [-] Sentinel received, thread is exiting {{(pid=73559) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.utils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Queue joined {{(pid=73559) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.utils [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Scheduler thread joined {{(pid=73559) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute manager graceful shutdown finished.
{{(pid=73559) stop /opt/stack/nova/nova/service.py:335}}
Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute service stopping RPC server on topic: compute-alt {{(pid=73559) _shutdown_rpc_server /opt/stack/nova/nova/service.py:313}}
Mai 07 19:30:34 devstack nova-compute[73559]: WARNING amqp [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Received method (60, 30) during closing channel 1. This method will be ignored
Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute service stopped RPC server on topic: compute-alt {{(pid=73559) _shutdown_rpc_server /opt/stack/nova/nova/service.py:317}}
Mai 07 19:30:34 devstack nova-compute[73559]: DEBUG nova.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] nova-compute service graceful shutdown finished. {{(pid=73559) stop /opt/stack/nova/nova/service.py:348}}
Mai 07 19:30:34 devstack nova-compute[73559]: INFO oslo_service.backend._threading.service [None req-0d045578-8aa1-49f6-932f-4579989fc1e3 None None] Graceful shutdown finish
Mai 07 19:30:35 devstack nova-compute[86443]: Service is starting with native threading. This is currently experimental. Do not use it in production without first testing it in pre-production.
Mai 07 19:30:39 devstack nova-compute[86443]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=86443) initialize /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:44}}
Mai 07 19:30:39 devstack nova-compute[86443]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=86443) initialize /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:44}}
Mai 07 19:30:39 devstack nova-compute[86443]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=86443) initialize /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:44}}
Mai 07 19:30:39 devstack nova-compute[86443]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Mai 07 19:30:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:30:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.015s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:30:40 devstack nova-compute[86443]: INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Mai 07 19:30:40 devstack nova-compute[86443]: WARNING nova.compute.manager [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] In native threading mode the number of concurrent builds, and snapshots should be limited to the same number. The current configuration has differing limits: max_concurrent_builds: 10, max_concurrent_snapshots: 5. Nova will use a single, overall limit of 10 for these tasks.
Mai 07 19:30:40 devstack nova-compute[86443]: INFO nova.utils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] The long task thread pool MainProcess.long_task is initialized
Mai 07 19:30:40 devstack nova-compute[86443]: INFO nova.virt.driver [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Loading compute driver 'libvirt.LibvirtDriver'
Mai 07 19:30:41 devstack nova-compute[86443]: INFO nova.compute.provider_config [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Acquiring lock "singleton_lock" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}}
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Acquired lock "singleton_lock" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}}
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Releasing lock "singleton_lock" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}}
Mai 07 19:30:42 devstack nova-compute[86443]: WARNING oslo_service.backend._threading.service [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] no_fork=True: running service in main process
Mai 07 19:30:42 devstack nova-compute[86443]: INFO nova.service [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Starting compute node (version 33.1.0)
Mai 07 19:30:42 devstack nova-compute[86443]: INFO nova.utils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] The default thread pool MainProcess.default is initialized
Mai 07 19:30:42 devstack
nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}}
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Starting event thread {{(pid=86443) start /opt/stack/nova/nova/virt/libvirt/host.py:252}}
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Starting connection event dispatch thread {{(pid=86443) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:553}}
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Connecting to libvirt: qemu:///system {{(pid=86443) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:558}}
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Registering for lifecycle events {{(pid=86443) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:564}}
Mai 07 19:30:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Registering for connection events: {{(pid=86443) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:585}}
Mai 07 19:30:42 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Connection event '1' reason 'None'
Mai 07 19:30:42 devstack nova-compute[86443]: INFO nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host capabilities
[Libvirt host capabilities XML follows in the journal, but its markup was stripped during capture, leaving only element text interleaved with repeated journald prefixes. Recoverable details: host UUID 1edef36a-6b3a-4b67-b01c-d6a682c117a8; architecture x86_64; CPU model EPYC-IBPB, vendor AMD; migration URI transports tcp and rdma; memory/page counters 12248276, 3062069, 0, 0; security models none and dac (DOI 0, labels +64055:+994); hvm guest support for alpha (64-bit, /usr/bin/qemu-system-alpha, machine clipper), arm (32-bit, /usr/bin/qemu-system-arm, machines virt-2.6 through virt-8.2 plus many boards: raspi0/raspi1ap/raspi2b, vexpress-a9/a15, realview-*, mps2-*/mps3-*, assorted *-bmc boards, and others), aarch64 (64-bit, /usr/bin/qemu-system-aarch64, a similar virt machine set plus sbsa-ref, xlnx-zcu102, xlnx-versal-virt, raspi3ap, raspi3b), cris (32-bit, /usr/bin/qemu-system-cris, machine axis-dev88), and i386 (32-bit, /usr/bin/qemu-system-i386, machines pc-i440fx-*, pc-q35-*, ubuntu, ubuntu-q35, isapc, x-remote, microvm; the dump is truncated here).]
pc-i440fx-2.3 Mai 07 19:30:42 devstack nova-compute[86443]: pc-q35-jammy-hpb-maxcpus Mai 07 19:30:42 devstack nova-compute[86443]: pc-q35-8.2 Mai 07 19:30:42 devstack nova-compute[86443]: q35 Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-kinetic-hpb Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-focal-hpb Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-4.0 Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-noble Mai 07 19:30:42 devstack nova-compute[86443]: pc Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-disco Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-groovy-hpb Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-8.2 Mai 07 19:30:42 devstack nova-compute[86443]: pc Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-hirsute-hpb Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-5.0 Mai 07 19:30:42 devstack nova-compute[86443]: pc-q35-6.2 Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-2.8 Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-eoan Mai 07 19:30:42 devstack nova-compute[86443]: pc-q35-2.5 Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-3.0 Mai 07 19:30:42 devstack nova-compute[86443]: pc-q35-yakkety Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-mantic-hpb-maxcpus Mai 07 19:30:42 devstack nova-compute[86443]: pc-i440fx-7.2 Mai 07 19:30:42 devstack nova-compute[86443]: pc-q35-2.11 Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack 
nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-m68k Mai 07 19:30:42 devstack nova-compute[86443]: mcf5208evb Mai 07 19:30:42 devstack nova-compute[86443]: virt-7.0 Mai 07 19:30:42 devstack nova-compute[86443]: an5206 Mai 07 19:30:42 devstack nova-compute[86443]: virt-6.0 Mai 07 19:30:42 devstack nova-compute[86443]: q800 Mai 07 19:30:42 devstack nova-compute[86443]: virt-8.1 Mai 07 19:30:42 devstack nova-compute[86443]: virt-7.2 Mai 07 19:30:42 devstack nova-compute[86443]: virt-6.2 Mai 07 19:30:42 devstack nova-compute[86443]: virt-8.0 Mai 07 19:30:42 devstack nova-compute[86443]: next-cube Mai 07 19:30:42 devstack nova-compute[86443]: virt-7.1 Mai 07 19:30:42 devstack nova-compute[86443]: virt-6.1 Mai 07 19:30:42 devstack nova-compute[86443]: virt-8.2 Mai 07 19:30:42 devstack nova-compute[86443]: virt Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-microblaze Mai 07 19:30:42 devstack nova-compute[86443]: petalogix-s3adsp1800 Mai 07 19:30:42 devstack nova-compute[86443]: petalogix-ml605 Mai 07 19:30:42 devstack nova-compute[86443]: xlnx-zynqmp-pmu Mai 07 19:30:42 devstack 
nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-microblazeel Mai 07 19:30:42 devstack nova-compute[86443]: petalogix-s3adsp1800 Mai 07 19:30:42 devstack nova-compute[86443]: petalogix-ml605 Mai 07 19:30:42 devstack nova-compute[86443]: xlnx-zynqmp-pmu Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-mips Mai 07 19:30:42 devstack nova-compute[86443]: malta Mai 07 19:30:42 devstack nova-compute[86443]: mipssim Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack 
nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-mipsel Mai 07 19:30:42 devstack nova-compute[86443]: malta Mai 07 19:30:42 devstack nova-compute[86443]: mipssim Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 64 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-mips64 Mai 07 19:30:42 devstack nova-compute[86443]: malta Mai 07 19:30:42 devstack nova-compute[86443]: mipssim Mai 07 19:30:42 devstack nova-compute[86443]: pica61 Mai 07 19:30:42 devstack nova-compute[86443]: magnum Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 64 Mai 07 19:30:42 devstack nova-compute[86443]: 
/usr/bin/qemu-system-mips64el Mai 07 19:30:42 devstack nova-compute[86443]: malta Mai 07 19:30:42 devstack nova-compute[86443]: loongson3-virt Mai 07 19:30:42 devstack nova-compute[86443]: mipssim Mai 07 19:30:42 devstack nova-compute[86443]: pica61 Mai 07 19:30:42 devstack nova-compute[86443]: magnum Mai 07 19:30:42 devstack nova-compute[86443]: boston Mai 07 19:30:42 devstack nova-compute[86443]: fuloong2e Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-ppc Mai 07 19:30:42 devstack nova-compute[86443]: g3beige Mai 07 19:30:42 devstack nova-compute[86443]: amigaone Mai 07 19:30:42 devstack nova-compute[86443]: virtex-ml507 Mai 07 19:30:42 devstack nova-compute[86443]: mac99 Mai 07 19:30:42 devstack nova-compute[86443]: ppce500 Mai 07 19:30:42 devstack nova-compute[86443]: sam460ex Mai 07 19:30:42 devstack nova-compute[86443]: pegasos2 Mai 07 19:30:42 devstack nova-compute[86443]: bamboo Mai 07 19:30:42 devstack nova-compute[86443]: 40p Mai 07 19:30:42 devstack nova-compute[86443]: ref405ep Mai 07 19:30:42 devstack nova-compute[86443]: mpc8544ds Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack 
nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 64 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-ppc64 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-noble Mai 07 19:30:42 devstack nova-compute[86443]: pseries Mai 07 19:30:42 devstack nova-compute[86443]: amigaone Mai 07 19:30:42 devstack nova-compute[86443]: powernv9 Mai 07 19:30:42 devstack nova-compute[86443]: powernv Mai 07 19:30:42 devstack nova-compute[86443]: pseries-4.1 Mai 07 19:30:42 devstack nova-compute[86443]: mpc8544ds Mai 07 19:30:42 devstack nova-compute[86443]: pseries-6.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.5 Mai 07 19:30:42 devstack nova-compute[86443]: powernv10 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-xenial Mai 07 19:30:42 devstack nova-compute[86443]: pseries-4.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-6.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-yakkety Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.6 Mai 07 19:30:42 devstack nova-compute[86443]: ppce500 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-bionic-sxxm Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.7 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-3.0 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-8.0 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-5.0 Mai 07 19:30:42 devstack nova-compute[86443]: 40p Mai 07 19:30:42 devstack nova-compute[86443]: pseries-lunar Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.8 Mai 07 19:30:42 devstack nova-compute[86443]: pegasos2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-hirsute Mai 07 19:30:42 devstack nova-compute[86443]: pseries-3.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-8.1 Mai 07 
19:30:42 devstack nova-compute[86443]: pseries-5.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-eoan Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.9 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-zesty Mai 07 19:30:42 devstack nova-compute[86443]: bamboo Mai 07 19:30:42 devstack nova-compute[86443]: pseries-groovy Mai 07 19:30:42 devstack nova-compute[86443]: pseries-focal Mai 07 19:30:42 devstack nova-compute[86443]: g3beige Mai 07 19:30:42 devstack nova-compute[86443]: pseries-8.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-5.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-disco Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.12-sxxm Mai 07 19:30:42 devstack nova-compute[86443]: pseries-mantic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-kinetic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.10 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-7.0 Mai 07 19:30:42 devstack nova-compute[86443]: virtex-ml507 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.11 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-7.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-cosmic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-bionic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.12 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-7.2 Mai 07 19:30:42 devstack nova-compute[86443]: mac99 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-impish Mai 07 19:30:42 devstack nova-compute[86443]: pseries-jammy Mai 07 19:30:42 devstack nova-compute[86443]: pseries-artful Mai 07 19:30:42 devstack nova-compute[86443]: sam460ex Mai 07 19:30:42 devstack nova-compute[86443]: ref405ep Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.3 Mai 07 19:30:42 devstack nova-compute[86443]: powernv8 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-4.0 Mai 07 19:30:42 
devstack nova-compute[86443]: pseries-6.0 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.4 Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 64 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-ppc64le Mai 07 19:30:42 devstack nova-compute[86443]: pseries-noble Mai 07 19:30:42 devstack nova-compute[86443]: pseries Mai 07 19:30:42 devstack nova-compute[86443]: amigaone Mai 07 19:30:42 devstack nova-compute[86443]: powernv9 Mai 07 19:30:42 devstack nova-compute[86443]: powernv Mai 07 19:30:42 devstack nova-compute[86443]: pseries-4.1 Mai 07 19:30:42 devstack nova-compute[86443]: mpc8544ds Mai 07 19:30:42 devstack nova-compute[86443]: pseries-6.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.5 Mai 07 19:30:42 devstack nova-compute[86443]: powernv10 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-xenial Mai 07 19:30:42 devstack nova-compute[86443]: pseries-4.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-6.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-yakkety Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.6 Mai 07 19:30:42 devstack nova-compute[86443]: ppce500 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-bionic-sxxm Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.7 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-3.0 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-8.0 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-5.0 Mai 07 
19:30:42 devstack nova-compute[86443]: 40p Mai 07 19:30:42 devstack nova-compute[86443]: pseries-lunar Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.8 Mai 07 19:30:42 devstack nova-compute[86443]: pegasos2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-hirsute Mai 07 19:30:42 devstack nova-compute[86443]: pseries-3.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-8.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-5.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-eoan Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.9 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-zesty Mai 07 19:30:42 devstack nova-compute[86443]: bamboo Mai 07 19:30:42 devstack nova-compute[86443]: pseries-groovy Mai 07 19:30:42 devstack nova-compute[86443]: pseries-focal Mai 07 19:30:42 devstack nova-compute[86443]: g3beige Mai 07 19:30:42 devstack nova-compute[86443]: pseries-8.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-5.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-disco Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.12-sxxm Mai 07 19:30:42 devstack nova-compute[86443]: pseries-mantic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-kinetic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.10 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-7.0 Mai 07 19:30:42 devstack nova-compute[86443]: virtex-ml507 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.11 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-7.1 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-cosmic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-bionic Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.12 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.2 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-7.2 Mai 07 19:30:42 devstack nova-compute[86443]: mac99 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-impish Mai 07 19:30:42 
devstack nova-compute[86443]: pseries-jammy Mai 07 19:30:42 devstack nova-compute[86443]: pseries-artful Mai 07 19:30:42 devstack nova-compute[86443]: sam460ex Mai 07 19:30:42 devstack nova-compute[86443]: ref405ep Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.3 Mai 07 19:30:42 devstack nova-compute[86443]: powernv8 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-4.0 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-6.0 Mai 07 19:30:42 devstack nova-compute[86443]: pseries-2.4 Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-riscv32 Mai 07 19:30:42 devstack nova-compute[86443]: virt Mai 07 19:30:42 devstack nova-compute[86443]: spike Mai 07 19:30:42 devstack nova-compute[86443]: opentitan Mai 07 19:30:42 devstack nova-compute[86443]: sifive_u Mai 07 19:30:42 devstack nova-compute[86443]: sifive_e Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack 
nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 64 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-riscv64 Mai 07 19:30:42 devstack nova-compute[86443]: virt Mai 07 19:30:42 devstack nova-compute[86443]: spike Mai 07 19:30:42 devstack nova-compute[86443]: microchip-icicle-kit Mai 07 19:30:42 devstack nova-compute[86443]: sifive_u Mai 07 19:30:42 devstack nova-compute[86443]: shakti_c Mai 07 19:30:42 devstack nova-compute[86443]: sifive_e Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 64 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-s390x Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-noble Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-8.0 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-disco Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-6.0 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.11 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-7.0 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-mantic Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-xenial Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-5.0 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-focal Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.8 Mai 07 19:30:42 devstack 
nova-compute[86443]: s390-ccw-virtio-4.0 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-zesty Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-8.2 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-6.2 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-3.0 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-groovy Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.5 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-7.2 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-5.2 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-artful Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-hirsute Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-4.2 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.10 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.7 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-lunar Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-eoan Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-kinetic Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-8.1 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-yakkety Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-6.1 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-cosmic Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-bionic Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.12 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-7.1 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.4 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-5.1 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.9 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-impish Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-4.1 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-jammy Mai 07 
19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-3.1 Mai 07 19:30:42 devstack nova-compute[86443]: s390-ccw-virtio-2.6 Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 32 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-sh4 Mai 07 19:30:42 devstack nova-compute[86443]: shix Mai 07 19:30:42 devstack nova-compute[86443]: r2d Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: hvm Mai 07 19:30:42 devstack nova-compute[86443]: Mai 07 19:30:42 devstack nova-compute[86443]: 64 Mai 07 19:30:42 devstack nova-compute[86443]: /usr/bin/qemu-system-sh4eb Mai 07 19:30:43 devstack nova-compute[86443]: shix Mai 07 19:30:43 devstack nova-compute[86443]: r2d Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack 
May 07 19:30:43 devstack nova-compute[86443]: [guest capabilities XML, markup lost in log capture] hvm, wordsize 32, emulator /usr/bin/qemu-system-sparc; machine types: SS-5, SS-20, LX, SPARCClassic, leon3_generic, SPARCbook, SS-4, SS-600MP, SS-10, Voyager
May 07 19:30:43 devstack nova-compute[86443]: [guest capabilities XML, markup lost in log capture] hvm, wordsize 64, emulator /usr/bin/qemu-system-sparc64; machine types: sun4u, niagara, sun4v
May 07 19:30:43 devstack nova-compute[86443]: [guest capabilities XML, markup lost in log capture] hvm, wordsize 64, emulator /usr/bin/qemu-system-x86_64; machine types: pc, q35, ubuntu, ubuntu-q35, isapc, microvm, x-remote, plus versioned aliases pc-i440fx-2.0 through pc-i440fx-8.2 and pc-q35-2.4 through pc-q35-8.2, and Ubuntu release aliases (trusty through noble, including -hpb, -maxcpus, and noble-v2 variants)
May 07 19:30:43 devstack nova-compute[86443]: [guest capabilities XML, markup lost in log capture] hvm, wordsize 32, emulator /usr/bin/qemu-system-xtensa; machine types: sim, kc705, kc705-nommu, ml605, ml605-nommu, virt, lx60, lx60-nommu, lx200, lx200-nommu
May 07 19:30:43 devstack nova-compute[86443]: [guest capabilities XML, markup lost in log capture] hvm, wordsize 32, emulator /usr/bin/qemu-system-xtensaeb; machine types: sim, kc705, kc705-nommu, ml605, ml605-nommu, virt, lx60, lx60-nommu, lx200, lx200-nommu
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for armv6l via machine types: {None, 'virt'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for i686 via machine types: {'pc', 'q35', 'ubuntu', 'ubuntu-q35'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domainCapabilities XML, markup lost in log capture] emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-i440fx-noble, arch i686; loaders /usr/share/AAVMF/AAVMF_CODE.fd and /usr/share/AAVMF/AAVMF32_CODE.fd (types rom, pflash; readonly yes/no; secure no); host CPU EPYC-IBPB, vendor AMD; custom CPU models: qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SapphireRapids, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC-Genoa, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486; memory backing: file, anonymous, memfd; disk devices: disk, cdrom, floppy, lun on buses ide, fdc, scsi, virtio, usb, sata with models virtio, virtio-transitional, virtio-non-transitional; graphics: sdl, vnc, spice, egl-headless, dbus; hostdev: subsystem (startupPolicy default, mandatory, requisite, optional; types usb, pci, scsi); video models virtio, virtio-transitional, virtio-non-transitional; rng backends random, egd, builtin; filesystem drivers path, handle, virtiofs; TPM models tpm-tis, tpm-crb with backends passthrough, emulator, external and versions 1.2, 2.0; redirdev bus usb over pty, unix, spicevmc; channel virtio; crypto backends qemu, builtin; Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domainCapabilities XML, markup lost in log capture; dump cut off at this point] emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-q35-noble, arch i686; loaders /usr/share/AAVMF/AAVMF_CODE.fd and /usr/share/AAVMF/AAVMF32_CODE.fd (types rom, pflash; readonly yes/no; secure no); host CPU EPYC-IBPB, vendor AMD; custom CPU models: same list as the pc dump (qemu64 through 486); memory backing: file, anonymous, memfd; disk devices: disk, cdrom, floppy, lun on buses fdc, scsi, virtio, usb, sata (no ide) with models virtio, virtio-transitional, virtio-non-transitional; graphics: sdl, vnc, spice, egl-headless, dbus; hostdev: subsystem (startupPolicy default, mandatory, requisite, optional; types usb, pci, scsi)
devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: virtio-transitional Mai 07 19:30:43 devstack nova-compute[86443]: virtio-non-transitional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: random Mai 07 19:30:43 devstack nova-compute[86443]: egd Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: path Mai 07 19:30:43 devstack nova-compute[86443]: handle Mai 07 19:30:43 devstack nova-compute[86443]: virtiofs Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: tpm-tis Mai 07 19:30:43 devstack nova-compute[86443]: tpm-crb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: passthrough Mai 07 19:30:43 devstack nova-compute[86443]: emulator Mai 07 19:30:43 devstack nova-compute[86443]: external Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: 1.2 Mai 07 19:30:43 devstack nova-compute[86443]: 2.0 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack 
nova-compute[86443]: pty Mai 07 19:30:43 devstack nova-compute[86443]: unix Mai 07 19:30:43 devstack nova-compute[86443]: spicevmc Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: qemu Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: relaxed Mai 07 19:30:43 devstack nova-compute[86443]: vapic Mai 07 19:30:43 devstack nova-compute[86443]: spinlocks Mai 07 19:30:43 devstack nova-compute[86443]: vpindex Mai 07 19:30:43 devstack nova-compute[86443]: runtime Mai 07 19:30:43 devstack nova-compute[86443]: synic Mai 07 19:30:43 devstack nova-compute[86443]: stimer Mai 07 19:30:43 devstack nova-compute[86443]: reset Mai 07 19:30:43 devstack nova-compute[86443]: vendor_id Mai 07 19:30:43 devstack nova-compute[86443]: frequencies Mai 07 19:30:43 devstack nova-compute[86443]: reenlightenment Mai 07 19:30:43 devstack nova-compute[86443]: tlbflush Mai 07 19:30:43 devstack 
nova-compute[86443]: ipi Mai 07 19:30:43 devstack nova-compute[86443]: avic Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/bin/qemu-system-i386 Mai 07 19:30:43 devstack nova-compute[86443]: kvm Mai 07 19:30:43 devstack nova-compute[86443]: pc-i440fx-noble-v2 Mai 07 19:30:43 devstack nova-compute[86443]: i686 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/AAVMF/AAVMF_CODE.fd Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/AAVMF/AAVMF32_CODE.fd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: rom Mai 07 19:30:43 devstack nova-compute[86443]: pflash Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: yes Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 
devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: on Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: AMD Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: qemu64 Mai 07 19:30:43 devstack nova-compute[86443]: 
qemu32 Mai 07 19:30:43 devstack nova-compute[86443]: phenom Mai 07 19:30:43 devstack nova-compute[86443]: pentium3 Mai 07 19:30:43 devstack nova-compute[86443]: pentium2 Mai 07 19:30:43 devstack nova-compute[86443]: pentium Mai 07 19:30:43 devstack nova-compute[86443]: n270 Mai 07 19:30:43 devstack nova-compute[86443]: kvm64 Mai 07 19:30:43 devstack nova-compute[86443]: kvm32 Mai 07 19:30:43 devstack nova-compute[86443]: coreduo Mai 07 19:30:43 devstack nova-compute[86443]: core2duo Mai 07 19:30:43 devstack nova-compute[86443]: athlon Mai 07 19:30:43 devstack nova-compute[86443]: Westmere-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Westmere Mai 07 19:30:43 devstack nova-compute[86443]: Snowridge Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client Mai 07 19:30:43 devstack nova-compute[86443]: SapphireRapids Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge Mai 07 19:30:43 devstack nova-compute[86443]: Penryn Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G5 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G4 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G3 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G2 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G1 Mai 07 19:30:43 devstack nova-compute[86443]: Nehalem-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Nehalem Mai 07 19:30:43 devstack nova-compute[86443]: IvyBridge-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: IvyBridge Mai 07 19:30:43 devstack nova-compute[86443]: Icelake-Server-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Icelake-Server Mai 07 
19:30:43 devstack nova-compute[86443]: Haswell-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Haswell-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Haswell-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Haswell Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Rome Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Milan Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Genoa Mai 07 19:30:43 devstack nova-compute[86443]: EPYC Mai 07 19:30:43 devstack nova-compute[86443]: Dhyana Mai 07 19:30:43 devstack nova-compute[86443]: Cooperlake Mai 07 19:30:43 devstack nova-compute[86443]: Conroe Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell Mai 07 19:30:43 devstack nova-compute[86443]: 486 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: file Mai 07 19:30:43 devstack nova-compute[86443]: anonymous Mai 07 19:30:43 devstack nova-compute[86443]: memfd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: disk Mai 07 19:30:43 devstack nova-compute[86443]: cdrom Mai 07 19:30:43 devstack nova-compute[86443]: floppy Mai 07 19:30:43 devstack nova-compute[86443]: lun Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack 
nova-compute[86443]: ide Mai 07 19:30:43 devstack nova-compute[86443]: fdc Mai 07 19:30:43 devstack nova-compute[86443]: scsi Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: sata Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: virtio-transitional Mai 07 19:30:43 devstack nova-compute[86443]: virtio-non-transitional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: sdl Mai 07 19:30:43 devstack nova-compute[86443]: vnc Mai 07 19:30:43 devstack nova-compute[86443]: spice Mai 07 19:30:43 devstack nova-compute[86443]: egl-headless Mai 07 19:30:43 devstack nova-compute[86443]: dbus Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: subsystem Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: default Mai 07 19:30:43 devstack nova-compute[86443]: mandatory Mai 07 19:30:43 devstack nova-compute[86443]: requisite Mai 07 19:30:43 devstack nova-compute[86443]: optional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: pci Mai 07 19:30:43 devstack nova-compute[86443]: scsi Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 
19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: virtio-transitional Mai 07 19:30:43 devstack nova-compute[86443]: virtio-non-transitional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: random Mai 07 19:30:43 devstack nova-compute[86443]: egd Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: path Mai 07 19:30:43 devstack nova-compute[86443]: handle Mai 07 19:30:43 devstack nova-compute[86443]: virtiofs Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: tpm-tis Mai 07 19:30:43 devstack nova-compute[86443]: tpm-crb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: passthrough Mai 07 19:30:43 devstack nova-compute[86443]: emulator Mai 07 19:30:43 devstack nova-compute[86443]: external Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: 1.2 Mai 07 19:30:43 devstack nova-compute[86443]: 2.0 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 
devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: pty Mai 07 19:30:43 devstack nova-compute[86443]: unix Mai 07 19:30:43 devstack nova-compute[86443]: spicevmc Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: qemu Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: relaxed Mai 07 19:30:43 devstack nova-compute[86443]: vapic Mai 07 19:30:43 devstack nova-compute[86443]: spinlocks Mai 07 19:30:43 devstack nova-compute[86443]: vpindex Mai 07 19:30:43 devstack nova-compute[86443]: runtime Mai 07 19:30:43 devstack nova-compute[86443]: synic Mai 07 19:30:43 devstack nova-compute[86443]: stimer Mai 07 19:30:43 devstack nova-compute[86443]: reset Mai 07 19:30:43 devstack nova-compute[86443]: vendor_id Mai 07 19:30:43 devstack nova-compute[86443]: frequencies Mai 07 19:30:43 devstack nova-compute[86443]: 
reenlightenment Mai 07 19:30:43 devstack nova-compute[86443]: tlbflush Mai 07 19:30:43 devstack nova-compute[86443]: ipi Mai 07 19:30:43 devstack nova-compute[86443]: avic Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/bin/qemu-system-i386 Mai 07 19:30:43 devstack nova-compute[86443]: kvm Mai 07 19:30:43 devstack nova-compute[86443]: pc-q35-noble-v2 Mai 07 19:30:43 devstack nova-compute[86443]: i686 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/AAVMF/AAVMF_CODE.fd Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/AAVMF/AAVMF32_CODE.fd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: rom Mai 07 19:30:43 devstack nova-compute[86443]: pflash Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: yes Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 
19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: on Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: AMD Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 
07 19:30:43 devstack nova-compute[86443]: qemu64 Mai 07 19:30:43 devstack nova-compute[86443]: qemu32 Mai 07 19:30:43 devstack nova-compute[86443]: phenom Mai 07 19:30:43 devstack nova-compute[86443]: pentium3 Mai 07 19:30:43 devstack nova-compute[86443]: pentium2 Mai 07 19:30:43 devstack nova-compute[86443]: pentium Mai 07 19:30:43 devstack nova-compute[86443]: n270 Mai 07 19:30:43 devstack nova-compute[86443]: kvm64 Mai 07 19:30:43 devstack nova-compute[86443]: kvm32 Mai 07 19:30:43 devstack nova-compute[86443]: coreduo Mai 07 19:30:43 devstack nova-compute[86443]: core2duo Mai 07 19:30:43 devstack nova-compute[86443]: athlon Mai 07 19:30:43 devstack nova-compute[86443]: Westmere-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Westmere Mai 07 19:30:43 devstack nova-compute[86443]: Snowridge Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client Mai 07 19:30:43 devstack nova-compute[86443]: SapphireRapids Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge Mai 07 19:30:43 devstack nova-compute[86443]: Penryn Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G5 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G4 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G3 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G2 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G1 Mai 07 19:30:43 devstack nova-compute[86443]: Nehalem-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Nehalem Mai 07 19:30:43 devstack nova-compute[86443]: IvyBridge-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: IvyBridge Mai 07 19:30:43 devstack 
nova-compute[86443]: Icelake-Server-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Icelake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Haswell-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Haswell-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Haswell-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Haswell Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Rome Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Milan Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Genoa Mai 07 19:30:43 devstack nova-compute[86443]: EPYC Mai 07 19:30:43 devstack nova-compute[86443]: Dhyana Mai 07 19:30:43 devstack nova-compute[86443]: Cooperlake Mai 07 19:30:43 devstack nova-compute[86443]: Conroe Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell Mai 07 19:30:43 devstack nova-compute[86443]: 486 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: file Mai 07 19:30:43 devstack nova-compute[86443]: anonymous Mai 07 19:30:43 devstack nova-compute[86443]: memfd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: disk Mai 07 19:30:43 devstack nova-compute[86443]: cdrom Mai 07 19:30:43 devstack nova-compute[86443]: floppy Mai 07 19:30:43 devstack nova-compute[86443]: lun Mai 07 
May 07 19:30:43 devstack nova-compute[86443]: [domainCapabilities XML body elided; the XML tags were stripped in this capture. Recoverable values: disk buses (fdc, scsi, virtio, usb, sata), virtio model variants (virtio, virtio-transitional, virtio-non-transitional), graphics (sdl, vnc, spice, egl-headless, dbus), hostdev (subsystem; default, mandatory, requisite, optional; usb, pci, scsi), rng backends (random, egd, builtin), filesystem (path, handle, virtiofs), TPM (tpm-tis, tpm-crb; passthrough, emulator, external; 1.2, 2.0), redirdev (usb), channel (pty, unix, spicevmc), memballoon (virtio), crypto (qemu; builtin), Hyper-V enlightenments (relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic)] {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for m68k via machine types: {None, 'virt'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for ppc64 via machine types: {None, 'pseries', 'powernv'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-riscv64' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-sparc64' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35', 'ubuntu', 'ubuntu-q35'} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
May 07 19:30:43 devstack nova-compute[86443]: [domainCapabilities XML body elided; the XML tags were stripped in this capture. Recoverable values: emulator /usr/bin/qemu-system-x86_64, domain kvm, machine pc-i440fx-noble, arch x86_64, firmware efi with loader /usr/share/OVMF/OVMF_CODE_4M.fd (types rom, pflash; readonly yes/no; secure no), host-model CPU EPYC-IBPB (vendor AMD), custom CPU models: qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SapphireRapids, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC-Genoa, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486; memory backing (file, anonymous, memfd), disk devices (disk, cdrom, floppy, lun) on buses (ide, fdc, scsi, virtio, usb, sata), graphics (sdl, vnc, spice, egl-headless, dbus), hostdev, rng (random, egd, builtin), filesystem (path, handle, virtiofs), TPM (tpm-tis, tpm-crb; passthrough, emulator, external; 1.2, 2.0), channel (pty, unix, spicevmc), crypto, Hyper-V enlightenments (relaxed through avic)] {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}}
May 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
May 07 19:30:43 devstack nova-compute[86443]: [domainCapabilities XML body elided and truncated mid-dump; the XML tags were stripped in this capture. Recoverable values: emulator /usr/bin/qemu-system-x86_64, domain kvm, machine pc-q35-noble, arch x86_64, firmware efi with loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd, /usr/share/ovmf/OVMF.amdsev.fd, /usr/share/OVMF/OVMF_CODE_4M.fd (types rom, pflash; readonly yes/no; secure yes/no), host-model CPU EPYC-IBPB (vendor AMD), custom CPU models qemu64 through EPYC-Rome, where the capture ends]
EPYC-Milan Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Genoa Mai 07 19:30:43 devstack nova-compute[86443]: EPYC Mai 07 19:30:43 devstack nova-compute[86443]: Dhyana Mai 07 19:30:43 devstack nova-compute[86443]: Cooperlake Mai 07 19:30:43 devstack nova-compute[86443]: Conroe Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell Mai 07 19:30:43 devstack nova-compute[86443]: 486 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: file Mai 07 19:30:43 devstack nova-compute[86443]: anonymous Mai 07 19:30:43 devstack nova-compute[86443]: memfd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: disk Mai 07 19:30:43 devstack nova-compute[86443]: cdrom Mai 07 19:30:43 devstack nova-compute[86443]: floppy Mai 07 19:30:43 devstack nova-compute[86443]: lun Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: fdc Mai 07 19:30:43 devstack nova-compute[86443]: scsi Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: sata Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 
19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: virtio-transitional Mai 07 19:30:43 devstack nova-compute[86443]: virtio-non-transitional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: sdl Mai 07 19:30:43 devstack nova-compute[86443]: vnc Mai 07 19:30:43 devstack nova-compute[86443]: spice Mai 07 19:30:43 devstack nova-compute[86443]: egl-headless Mai 07 19:30:43 devstack nova-compute[86443]: dbus Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: subsystem Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: default Mai 07 19:30:43 devstack nova-compute[86443]: mandatory Mai 07 19:30:43 devstack nova-compute[86443]: requisite Mai 07 19:30:43 devstack nova-compute[86443]: optional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: pci Mai 07 19:30:43 devstack nova-compute[86443]: scsi Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: virtio-transitional Mai 07 19:30:43 devstack nova-compute[86443]: virtio-non-transitional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 
devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: random Mai 07 19:30:43 devstack nova-compute[86443]: egd Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: path Mai 07 19:30:43 devstack nova-compute[86443]: handle Mai 07 19:30:43 devstack nova-compute[86443]: virtiofs Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: tpm-tis Mai 07 19:30:43 devstack nova-compute[86443]: tpm-crb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: passthrough Mai 07 19:30:43 devstack nova-compute[86443]: emulator Mai 07 19:30:43 devstack nova-compute[86443]: external Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: 1.2 Mai 07 19:30:43 devstack nova-compute[86443]: 2.0 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: pty Mai 07 19:30:43 devstack nova-compute[86443]: unix Mai 07 19:30:43 devstack nova-compute[86443]: spicevmc Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 
19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: qemu Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: relaxed Mai 07 19:30:43 devstack nova-compute[86443]: vapic Mai 07 19:30:43 devstack nova-compute[86443]: spinlocks Mai 07 19:30:43 devstack nova-compute[86443]: vpindex Mai 07 19:30:43 devstack nova-compute[86443]: runtime Mai 07 19:30:43 devstack nova-compute[86443]: synic Mai 07 19:30:43 devstack nova-compute[86443]: stimer Mai 07 19:30:43 devstack nova-compute[86443]: reset Mai 07 19:30:43 devstack nova-compute[86443]: vendor_id Mai 07 19:30:43 devstack nova-compute[86443]: frequencies Mai 07 19:30:43 devstack nova-compute[86443]: reenlightenment Mai 07 19:30:43 devstack nova-compute[86443]: tlbflush Mai 07 19:30:43 devstack nova-compute[86443]: ipi Mai 07 19:30:43 devstack nova-compute[86443]: avic Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack 
nova-compute[86443]: {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/bin/qemu-system-x86_64 Mai 07 19:30:43 devstack nova-compute[86443]: kvm Mai 07 19:30:43 devstack nova-compute[86443]: pc-i440fx-noble-v2 Mai 07 19:30:43 devstack nova-compute[86443]: x86_64 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: efi Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/OVMF/OVMF_CODE_4M.fd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: rom Mai 07 19:30:43 devstack nova-compute[86443]: pflash Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: yes Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: on Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack 
nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: on Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: AMD Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: qemu64 Mai 07 19:30:43 devstack nova-compute[86443]: qemu32 Mai 07 
19:30:43 devstack nova-compute[86443]: phenom Mai 07 19:30:43 devstack nova-compute[86443]: pentium3 Mai 07 19:30:43 devstack nova-compute[86443]: pentium2 Mai 07 19:30:43 devstack nova-compute[86443]: pentium Mai 07 19:30:43 devstack nova-compute[86443]: n270 Mai 07 19:30:43 devstack nova-compute[86443]: kvm64 Mai 07 19:30:43 devstack nova-compute[86443]: kvm32 Mai 07 19:30:43 devstack nova-compute[86443]: coreduo Mai 07 19:30:43 devstack nova-compute[86443]: core2duo Mai 07 19:30:43 devstack nova-compute[86443]: athlon Mai 07 19:30:43 devstack nova-compute[86443]: Westmere-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Westmere Mai 07 19:30:43 devstack nova-compute[86443]: Snowridge Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client Mai 07 19:30:43 devstack nova-compute[86443]: SapphireRapids Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge Mai 07 19:30:43 devstack nova-compute[86443]: Penryn Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G5 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G4 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G3 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G2 Mai 07 19:30:43 devstack nova-compute[86443]: Opteron_G1 Mai 07 19:30:43 devstack nova-compute[86443]: Nehalem-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Nehalem Mai 07 19:30:43 devstack nova-compute[86443]: IvyBridge-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: IvyBridge Mai 07 19:30:43 devstack nova-compute[86443]: Icelake-Server-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Icelake-Server Mai 07 19:30:43 devstack 
nova-compute[86443]: Haswell-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Haswell-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Haswell-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Haswell Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Rome Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Milan Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-Genoa Mai 07 19:30:43 devstack nova-compute[86443]: EPYC Mai 07 19:30:43 devstack nova-compute[86443]: Dhyana Mai 07 19:30:43 devstack nova-compute[86443]: Cooperlake Mai 07 19:30:43 devstack nova-compute[86443]: Conroe Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Cascadelake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-noTSX Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Broadwell Mai 07 19:30:43 devstack nova-compute[86443]: 486 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: file Mai 07 19:30:43 devstack nova-compute[86443]: anonymous Mai 07 19:30:43 devstack nova-compute[86443]: memfd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: disk Mai 07 19:30:43 devstack nova-compute[86443]: cdrom Mai 07 19:30:43 devstack nova-compute[86443]: floppy Mai 07 19:30:43 devstack nova-compute[86443]: lun Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack 
nova-compute[86443]: ide Mai 07 19:30:43 devstack nova-compute[86443]: fdc Mai 07 19:30:43 devstack nova-compute[86443]: scsi Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: sata Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: virtio-transitional Mai 07 19:30:43 devstack nova-compute[86443]: virtio-non-transitional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: sdl Mai 07 19:30:43 devstack nova-compute[86443]: vnc Mai 07 19:30:43 devstack nova-compute[86443]: spice Mai 07 19:30:43 devstack nova-compute[86443]: egl-headless Mai 07 19:30:43 devstack nova-compute[86443]: dbus Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: subsystem Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: default Mai 07 19:30:43 devstack nova-compute[86443]: mandatory Mai 07 19:30:43 devstack nova-compute[86443]: requisite Mai 07 19:30:43 devstack nova-compute[86443]: optional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: pci Mai 07 19:30:43 devstack nova-compute[86443]: scsi Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 
19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: virtio-transitional Mai 07 19:30:43 devstack nova-compute[86443]: virtio-non-transitional Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: random Mai 07 19:30:43 devstack nova-compute[86443]: egd Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: path Mai 07 19:30:43 devstack nova-compute[86443]: handle Mai 07 19:30:43 devstack nova-compute[86443]: virtiofs Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: tpm-tis Mai 07 19:30:43 devstack nova-compute[86443]: tpm-crb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: passthrough Mai 07 19:30:43 devstack nova-compute[86443]: emulator Mai 07 19:30:43 devstack nova-compute[86443]: external Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: 1.2 Mai 07 19:30:43 devstack nova-compute[86443]: 2.0 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: usb Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 
devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: pty Mai 07 19:30:43 devstack nova-compute[86443]: unix Mai 07 19:30:43 devstack nova-compute[86443]: spicevmc Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: virtio Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: qemu Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: builtin Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: relaxed Mai 07 19:30:43 devstack nova-compute[86443]: vapic Mai 07 19:30:43 devstack nova-compute[86443]: spinlocks Mai 07 19:30:43 devstack nova-compute[86443]: vpindex Mai 07 19:30:43 devstack nova-compute[86443]: runtime Mai 07 19:30:43 devstack nova-compute[86443]: synic Mai 07 19:30:43 devstack nova-compute[86443]: stimer Mai 07 19:30:43 devstack nova-compute[86443]: reset Mai 07 19:30:43 devstack nova-compute[86443]: vendor_id Mai 07 19:30:43 devstack nova-compute[86443]: frequencies Mai 07 19:30:43 devstack nova-compute[86443]: 
reenlightenment Mai 07 19:30:43 devstack nova-compute[86443]: tlbflush Mai 07 19:30:43 devstack nova-compute[86443]: ipi Mai 07 19:30:43 devstack nova-compute[86443]: avic Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/bin/qemu-system-x86_64 Mai 07 19:30:43 devstack nova-compute[86443]: kvm Mai 07 19:30:43 devstack nova-compute[86443]: pc-q35-noble-v2 Mai 07 19:30:43 devstack nova-compute[86443]: x86_64 Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: efi Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/OVMF/OVMF_CODE_4M.ms.fd Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/OVMF/OVMF_CODE_4M.secboot.fd Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/ovmf/OVMF.amdsev.fd Mai 07 19:30:43 devstack nova-compute[86443]: /usr/share/OVMF/OVMF_CODE_4M.fd Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: rom Mai 07 19:30:43 devstack nova-compute[86443]: pflash Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: yes Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 
devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: yes Mai 07 19:30:43 devstack nova-compute[86443]: no Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: on Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: on Mai 07 19:30:43 devstack nova-compute[86443]: off Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: EPYC-IBPB Mai 07 19:30:43 devstack nova-compute[86443]: AMD Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: 
Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: Mai 07 19:30:43 devstack nova-compute[86443]: qemu64 Mai 07 19:30:43 devstack nova-compute[86443]: qemu32 Mai 07 19:30:43 devstack nova-compute[86443]: phenom Mai 07 19:30:43 devstack nova-compute[86443]: pentium3 Mai 07 19:30:43 devstack nova-compute[86443]: pentium2 Mai 07 19:30:43 devstack nova-compute[86443]: pentium Mai 07 19:30:43 devstack nova-compute[86443]: n270 Mai 07 19:30:43 devstack nova-compute[86443]: kvm64 Mai 07 19:30:43 devstack nova-compute[86443]: kvm32 Mai 07 19:30:43 devstack nova-compute[86443]: coreduo Mai 07 19:30:43 devstack nova-compute[86443]: core2duo Mai 07 19:30:43 devstack nova-compute[86443]: athlon Mai 07 19:30:43 devstack nova-compute[86443]: Westmere-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Westmere Mai 07 19:30:43 devstack nova-compute[86443]: Snowridge Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Server Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-noTSX-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: Skylake-Client Mai 07 19:30:43 devstack nova-compute[86443]: SapphireRapids Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge-IBRS Mai 07 19:30:43 devstack nova-compute[86443]: SandyBridge Mai 07 19:30:43 devstack nova-compute[86443]: Penryn Mai 
Mai 07 19:30:43 devstack nova-compute[86443]: [multi-line libvirt domain capabilities XML elided; markup lost in extraction. Recoverable content: supported CPU models (486, Conroe, Nehalem[-IBRS], IvyBridge[-IBRS], Haswell/-noTSX/-IBRS/-noTSX-IBRS, Broadwell/-noTSX/-IBRS/-noTSX-IBRS, Cascadelake-Server[-noTSX], Cooperlake, Icelake-Server[-noTSX], Opteron_G1-G5, EPYC, EPYC-IBPB, EPYC-Rome, EPYC-Milan, EPYC-Genoa, Dhyana); memory backing source types (file, anonymous, memfd); disk devices (disk, cdrom, floppy, lun) and buses (fdc, scsi, virtio, usb, sata); virtio models (virtio, virtio-transitional, virtio-non-transitional); graphics types (sdl, vnc, spice, egl-headless, dbus); hostdev mode (subsystem), start policies (default, mandatory, requisite, optional) and subsystem types (usb, pci, scsi); rng backend models (random, egd, builtin); filesystem drivers (path, handle, virtiofs); TPM models (tpm-tis, tpm-crb), backends (passthrough, emulator, external) and versions (1.2, 2.0); redirdev bus (usb); channel types (pty, unix, spicevmc); panic model (virtio); CFPC/SBBC/IBS values (qemu, builtin); Hyper-V enlightenments (relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic)] {{(pid=86443) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1105}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Getting domain capabilities for xtensaeb via machine types:
{None} {{(pid=86443) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:1020}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: the accel 'kvm' is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=86443) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1059}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Checking secure boot support for host arch (x86_64) {{(pid=86443) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1962}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Checking secure boot support for host arch (x86_64) {{(pid=86443) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1962}} Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Checking secure boot support for host arch (x86_64) {{(pid=86443) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1962}} Mai 07 19:30:43 devstack nova-compute[86443]: INFO nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Secure Boot support detected Mai 07 19:30:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.mount [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Initialising _HostMountState generation 0 {{(pid=86443) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Mai 07 19:30:43 devstack nova-compute[86443]: INFO nova.virt.node [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Determined node identity cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 from /opt/stack/data/nova/compute_id Mai 07 
19:30:43 devstack nova-compute[86443]: INFO nova.virt.node [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Determined node identity cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 from /opt/stack/data/nova/compute_id Mai 07 19:30:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Verified node cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 matches my host devstack {{(pid=86443) _check_for_host_rename /opt/stack/nova/nova/compute/manager.py:1712}} Mai 07 19:30:45 devstack nova-compute[86443]: INFO nova.compute.manager [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Auditing locally available compute 
resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:30:45 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] CMD "env LANG=C uptime" returned: 0 in 0.021s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Hypervisor/Node resource view: name=devstack free_ram=6326MB free_disk=14.976055145263672GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": 
"000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:30:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:30:47 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Total usable vcpus: 4, total allocated vcpus: 0 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:30:47 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Final resource view: name=devstack phys_ram=11961MB used_ram=512MB phys_disk=25GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:30:45 up 30 min, 1 user, load average: 2.52, 2.13, 1.74\n'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:30:47 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Refreshing inventories for resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:822}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Updating ProviderTree inventory for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:786}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Updating inventory in ProviderTree for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Refreshing aggregate associations for resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944, aggregates: None {{(pid=86443) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:831}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Refreshing trait associations for resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944, traits: 
HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_MMX,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VMVGA,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_F16C,HW_CPU_X86_SSE2,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_QXL,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_AC97,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,HW_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE {{(pid=86443) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:843}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None 
req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] /sys/module/kvm_amd/parameters/sev contains [N Mai 07 19:30:48 devstack nova-compute[86443]: ] {{(pid=86443) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:2038}} Mai 07 19:30:48 devstack nova-compute[86443]: INFO nova.virt.libvirt.host [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] kernel doesn't support AMD SEV Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:30:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:30:49 
devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:30:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.280s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:30:49 devstack nova-compute[86443]: DEBUG nova.service [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Creating RPC server for service: nova-compute on topic: compute {{(pid=86443) start /opt/stack/nova/nova/service.py:195}} Mai 07 19:30:49 devstack nova-compute[86443]: DEBUG nova.service [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Creating 2nd RPC server for service: nova-compute on topic: compute-alt {{(pid=86443) start /opt/stack/nova/nova/service.py:211}} Mai 07 19:30:49 devstack nova-compute[86443]: DEBUG nova.service [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] Join ServiceGroup membership for this service compute {{(pid=86443) start /opt/stack/nova/nova/service.py:221}} Mai 07 19:30:49 devstack nova-compute[86443]: DEBUG nova.servicegroup.drivers.db [None req-5a60c39b-58f6-410e-8b42-a60942481f47 None None] DB_Driver: join new ServiceGroup member devstack to the compute group, service = {{(pid=86443) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Mai 07 19:30:54 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.49 seconds {{(pid=86443) _run_loop 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:31:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:22 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:31:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:31:22 devstack nova-compute[86443]: INFO nova.compute.claims [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Claim successful on node devstack Mai 07 19:31:23 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:31:23 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:31:24 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 
19:31:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.142s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:31:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:31:25 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:31:25 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:31:25 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:31:26 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Mai 07 19:31:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:31:26 devstack nova-compute[86443]: DEBUG nova.policy [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47c66181a1ab44acb74977f56e0f3ca3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '573feae7be914c46a57ac9575b2f8e5f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._sync_power_states {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:31:27 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Creating image(s) Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1
tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:27 devstack nova-compute[86443]: WARNING nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor. 
Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Triggering sync for uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) _sync_power_states /opt/stack/nova/nova/compute/manager.py:11194}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:27 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 12.24 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None 
req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.part --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.part --force-share --output=json" returned: 0 in 0.135s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG nova.virt.images [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] e8633b10-b98a-4580-90f8-3091ca40fa29 was qcow2, converting to raw {{(pid=86443) fetch_to_raw /opt/stack/nova/nova/virt/images.py:278}} Mai 07 
19:31:28 devstack nova-compute[86443]: DEBUG nova.privsep.utils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=86443) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.part /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.converted {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.part /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.converted" returned: 0 in 0.234s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.converted --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a.converted --force-share --output=json" returned: 0 in 0.151s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.254s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:31:28 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 
tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:31:28 devstack nova-compute[86443]: INFO oslo.privsep.daemon [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpchbdpwh3/privsep.sock'] Mai 07 19:31:28 devstack sudo[86985]: quobyte : PWD=/ ; USER=root ; COMMAND=/opt/stack/data/venv/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpchbdpwh3/privsep.sock Mai 07 19:31:28 devstack sudo[86985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000) Mai 07 19:31:30 devstack sudo[86985]: pam_unix(sudo:session): session closed for user root Mai 07 19:31:30 devstack nova-compute[86443]: INFO oslo.privsep.daemon [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Spawned new privsep daemon via rootwrap Mai 07 19:31:30 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep daemon starting Mai 07 19:31:30 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Mai 07 19:31:30 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): 
CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Mai 07 19:31:30 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 86988 Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.153s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 
19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.132s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk 1073741824" returned: 0 in 0.052s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.203s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.130s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Checking if we can resize image /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk. 
size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json" returned: 0 in 0.115s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Cannot resize image /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk to a smaller size. 
{{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Ensure instance console log exists: /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 
tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:34 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Successfully created port: 4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:31:35 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Successfully updated port: 4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:31:35 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:31:35 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing instance network info cache due to event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:31:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] Acquiring lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:31:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] Acquired lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:31:35 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing network info cache for port 4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:31:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:31:36 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:31:36 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:31:36 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:31:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a252a4a8-c286-4815-a7fc-08d8afc8a175 req-2b588c22-9498-4b04-a7fa-fc19de755bc1 service nova] Releasing lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:31:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquired lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:31:37 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:31:37 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 
tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:31:38 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-b2c128d0-1c5d-489e-9894-416a0ef7e82a req-75502f7d-1a3c-4628-aaaa-5c80dd46121b service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:31:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-b2c128d0-1c5d-489e-9894-416a0ef7e82a req-75502f7d-1a3c-4628-aaaa-5c80dd46121b service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-b2c128d0-1c5d-489e-9894-416a0ef7e82a req-75502f7d-1a3c-4628-aaaa-5c80dd46121b service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-b2c128d0-1c5d-489e-9894-416a0ef7e82a req-75502f7d-1a3c-4628-aaaa-5c80dd46121b service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:38 devstack nova-compute[86443]: DEBUG
nova.compute.manager [req-b2c128d0-1c5d-489e-9894-416a0ef7e82a req-75502f7d-1a3c-4628-aaaa-5c80dd46121b service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:31:38 devstack nova-compute[86443]: WARNING nova.compute.manager [req-b2c128d0-1c5d-489e-9894-416a0ef7e82a req-75502f7d-1a3c-4628-aaaa-5c80dd46121b service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state building and task_state spawning. Mai 07 19:31:38 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Releasing lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [None 
req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance network_info: |[{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Start _get_guest_xml network_info=[{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:31:39 devstack nova-compute[86443]: WARNING 
nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-ServerStableDeviceRescueTest-server-1930754072', uuid='fbda5d5e-085d-499a-9b6c-5e8e388d5363'), owner=OwnerMeta(userid='47c66181a1ab44acb74977f56e0f3ca3', username='tempest-ServerStableDeviceRescueTest-2036085738-project-member', projectid='573feae7be914c46a57ac9575b2f8e5f', projectname='tempest-ServerStableDeviceRescueTest-2036085738'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175099.6459682) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CPU controller found on host. {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:31:39 devstack nova-compute[86443]: 
DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), 
maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.privsep.utils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=86443) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Mai 07 
19:31:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:31:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1930754072',display_name='tempest-ServerStableDeviceRescueTest-server-1930754072',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serverstabledevicerescuetest-server-1930754072',id=1,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0/3BWn/zzrBQnAkXdd1s2YFde8yarhfzAlXsa7tLa/iX4wVP25675chgq0wVKE1yt1vGqpfarn8QigLrDIoYheGujk2mCEoyKPhrMw8JZOvM12TXRkJHFDuOg8jZSc3g==',key_name='tempest-keypair-1700510776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='573feae7be914c46a57ac9575b2f8e5f',ramdisk_id='',reservation_id='r-lwlutd2e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-2036085738',owner_user_name='tempest-ServerStableDeviceRescueTest-2036085738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:31:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='47c66181a1ab44acb74977f56e0f3ca3',uuid=fbda5d5e-085d-499a-9b6c-5e8e388d5363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], 
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converting VIF {"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", 
"ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:31:39 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'pci_devices' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack 
nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] End _get_guest_xml xml= [multi-line libvirt domain XML elided: its element markup was stripped during log capture, leaving only text nodes interleaved with journald prefixes. Recoverable values: uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363, name instance-00000001, memory 196608 KiB, 1 vCPU; Nova instance metadata with server name tempest-ServerStableDeviceRescueTest-server-1930754072, creation time 2026-05-07 17:31:39, image format bare/qcow2, owner tempest-ServerStableDeviceRescueTest-2036085738-project-member in project tempest-ServerStableDeviceRescueTest-2036085738; sysinfo OpenStack Foundation / OpenStack Nova / 33.1.0, "Virtual Machine"; os type hvm; virtio RNG backed by /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Preparing to wait for external event
network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:31:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1930754072',display_name='tempest-ServerStableDeviceRescueTest-server-1930754072',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serverstabledevicerescuetest-server-1930754072',id=1,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0/3BWn/zzrBQnAkXdd1s2YFde8yarhfzAlXsa7tLa/iX4wVP25675chgq0wVKE1yt1vGqpfarn8QigLrDIoYheGujk2mCEoyKPhrMw8JZOvM12TXRkJHFDuOg8jZSc3g==',key_name='tempest-keypair-1700510776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='573feae7be914c46a57ac9575b2f8e5f',ramdisk_id='',reservation_id='r-lwlutd2e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_proje
ct_name='tempest-ServerStableDeviceRescueTest-2036085738',owner_user_name='tempest-ServerStableDeviceRescueTest-2036085738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:31:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='47c66181a1ab44acb74977f56e0f3ca3',uuid=fbda5d5e-085d-499a-9b6c-5e8e388d5363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converting VIF {"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": 
"fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG os_vif [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Created schema index Interface.name {{(pid=86443) autocreate_indices /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Created schema index Port.name {{(pid=86443) autocreate_indices /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Created schema index Bridge.name {{(pid=86443) autocreate_indices /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=86443) _transition /opt/stack/data/venv/lib/python3.12/site-packages/ovs/reconnect.py:519}} Mai 07 19:31:40 devstack 
nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [POLLOUT] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=86443) _transition /opt/stack/data/venv/lib/python3.12/site-packages/ovs/reconnect.py:519}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 
devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'bb4364f8-d7e4-5f58-9b85-25f0973d4e7e', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:40 devstack nova-compute[86443]: INFO oslo.privsep.daemon [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] 
Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp1hqatd87/privsep.sock'] Mai 07 19:31:40 devstack sudo[87033]: quobyte : PWD=/ ; USER=root ; COMMAND=/opt/stack/data/venv/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmp1hqatd87/privsep.sock Mai 07 19:31:40 devstack sudo[87033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000) Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:31:40 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.024s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=5987MB free_disk=14.926319122314453GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": 
"000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:40 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:41 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:31:41 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 1 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:31:41 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=704MB phys_disk=25GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:31:40 up 31 min, 1 user, load average: 3.53, 2.47, 1.88\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_573feae7be914c46a57ac9575b2f8e5f': '1', 'io_workload': '1'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:31:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:41 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:31:41 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:31:41 devstack sudo[87033]: pam_unix(sudo:session): session closed for user root Mai 07 19:31:41 devstack nova-compute[86443]: INFO oslo.privsep.daemon [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Spawned new privsep daemon via rootwrap Mai 07 19:31:41 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep daemon starting Mai 07 19:31:41 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Mai 07 19:31:41 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Mai 07 19:31:41 devstack nova-compute[86443]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 87041 Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4dc7dc14-7c, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4dc7dc14-7c, col_values=(('qos', UUID('87cb7963-5e3f-490b-9fe3-9e46b812bc5f')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4dc7dc14-7c, col_values=(('external_ids', {'iface-id': '4dc7dc14-7c9d-468b-941a-a17ba2f0390b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ca:f2:d2', 'vm-uuid': 'fbda5d5e-085d-499a-9b6c-5e8e388d5363'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:31:42 devstack 
nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:42 devstack nova-compute[86443]: INFO os_vif [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.227s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:42 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 59.98 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No BDM 
found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No VIF found with MAC fa:16:3e:ca:f2:d2, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-8465905b-4f19-4fd5-b8c2-b7cb209ed103 req-2e0e886e-c2e5-4afc-9b1a-e15acdd4da4b service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-8465905b-4f19-4fd5-b8c2-b7cb209ed103 req-2e0e886e-c2e5-4afc-9b1a-e15acdd4da4b service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-8465905b-4f19-4fd5-b8c2-b7cb209ed103 req-2e0e886e-c2e5-4afc-9b1a-e15acdd4da4b service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-8465905b-4f19-4fd5-b8c2-b7cb209ed103 req-2e0e886e-c2e5-4afc-9b1a-e15acdd4da4b service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-8465905b-4f19-4fd5-b8c2-b7cb209ed103 req-2e0e886e-c2e5-4afc-9b1a-e15acdd4da4b service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Processing event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:31:45 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:31:45 devstack nova-compute[86443]: DEBUG nova.virt.driver [-] Emitting event Started> {{(pid=86443) emit_event 
/opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:31:45 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Started (Lifecycle Event) Mai 07 19:31:45 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:31:45 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance spawned successfully. Mai 07 19:31:45 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] 
[instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:46 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:31:46 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Paused (Lifecycle Event) Mai 07 19:31:46 devstack nova-compute[86443]: INFO nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Took 19.09 seconds to spawn the instance on the hypervisor. 
Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6436e7af-a2c5-4c5e-bf1a-09229ae7f0b3 req-0309d2ee-1a63-4cc3-9a60-3428607435c1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6436e7af-a2c5-4c5e-bf1a-09229ae7f0b3 req-0309d2ee-1a63-4cc3-9a60-3428607435c1 service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6436e7af-a2c5-4c5e-bf1a-09229ae7f0b3 req-0309d2ee-1a63-4cc3-9a60-3428607435c1 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6436e7af-a2c5-4c5e-bf1a-09229ae7f0b3 req-0309d2ee-1a63-4cc3-9a60-3428607435c1 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6436e7af-a2c5-4c5e-bf1a-09229ae7f0b3 req-0309d2ee-1a63-4cc3-9a60-3428607435c1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:31:46 devstack nova-compute[86443]: WARNING nova.compute.manager [req-6436e7af-a2c5-4c5e-bf1a-09229ae7f0b3 req-0309d2ee-1a63-4cc3-9a60-3428607435c1 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state active and task_state None. Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:31:47 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Resumed (Lifecycle Event) Mai 07 19:31:47 devstack nova-compute[86443]: INFO nova.compute.manager [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Took 24.41 seconds to build instance. 
Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bb6688f-aa9a-4011-906e-86aebcf200d1 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.939s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 19.843s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:47 devstack nova-compute[86443]: INFO nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] [instance: 
fbda5d5e-085d-499a-9b6c-5e8e388d5363] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:31:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing instance network info cache due to event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] Acquiring lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] Acquired lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing network info cache for port 4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:53 devstack 
nova-compute[86443]: WARNING neutronclient.v2_0.client [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:31:53 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:31:54 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updated VIF entry in instance network info cache for port 4dc7dc14-7c9d-468b-941a-a17ba2f0390b. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:31:54 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:31:54 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-398bf3ed-02b8-42b4-a9a8-69704bc0008d req-19635cd4-ab16-440e-b6a2-98f60a72ca91 service nova] Releasing lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:31:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) 
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:31:57 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Starting instance... {{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:31:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:31:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:31:58 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:31:58 devstack nova-compute[86443]: INFO nova.compute.claims [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Claim successful on node devstack Mai 07 19:31:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:31:59 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:31:59 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:32:00 
devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.174s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:32:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:32:00 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:32:00 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:00 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:00 devstack nova-compute[86443]: DEBUG nova.policy [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f3230ea9529545f48fe6fafdb05a882b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ff7563f2fd6d456ca5a6feffe8a32082', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:32:01 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Mai 07 19:32:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Start spawning the instance on the hypervisor. {{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:32:02 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Creating image(s) Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "/opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk.info" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "/opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "/opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 
tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.124s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 
19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Successfully created port: ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.180s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CMD "env LC_ALL=C 
LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk 1073741824" returned: 0 in 0.069s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.271s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.135s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Checking if we can resize image /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk --force-share --output=json" returned: 0 in 0.118s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Cannot resize image 
/opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Ensure instance console log exists: /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:03 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:05 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Successfully updated port: ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:32:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquired lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:06 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 
tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:32:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-changed-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Refreshing instance network info cache due to event network-changed-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:32:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] Acquiring lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:07 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Instance cache missing network info. 
{{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:32:07 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:08 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:09 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Updating instance_info_cache with network_info: [{"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Releasing lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Instance network_info: |[{"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] Acquired lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Refreshing network info cache for port ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Start _get_guest_xml network_info=[{"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:32:10 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-VolumesAssistedSnapshotsTest-server-367025004', uuid='7f0ddc97-b43e-4a5a-8690-7583f53320f0'), owner=OwnerMeta(userid='f3230ea9529545f48fe6fafdb05a882b', username='tempest-VolumesAssistedSnapshotsTest-937399013-project-member', projectid='ff7563f2fd6d456ca5a6feffe8a32082', projectname='tempest-VolumesAssistedSnapshotsTest-937399013'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175130.0246894) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware 
[None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 
tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:31:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-367025004',display_name='tempest-VolumesAssistedSnapshotsTest-server-367025004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesassistedsnapshotstest-server-367025004',id=2,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO4w5hPBxKmd3vLt1iJD44reCQ0uqFrZUztfteqXGJNl11aaESOInYHsQrEQQB/y1JeJv2kdN3MUlgyyHmeLwK31PKytWfSeRKn35rXXESwYXLSIqlni7xjR7PTsUT6ptA==',key_name='tempest-keypair-1707075065',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ff7563f2fd6d456ca5a6feffe8a32082',ramdisk_id='',reservation_id='r-c5m0tai2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest
-VolumesAssistedSnapshotsTest-937399013',owner_user_name='tempest-VolumesAssistedSnapshotsTest-937399013-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:32:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f3230ea9529545f48fe6fafdb05a882b',uuid=7f0ddc97-b43e-4a5a-8690-7583f53320f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Converting VIF {"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": 
"fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:f7:a7,bridge_name='br-int',has_traffic_filtering=True,id=ffa25775-a078-439b-bc5a-4aa8ef9b3b2b,network=Network(727cabe8-9053-4d06-9f7e-70dbb412fab9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffa25775-a0') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lazy-loading 'pci_devices' on 
Instance uuid 7f0ddc97-b43e-4a5a-8690-7583f53320f0 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:10 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] End _get_guest_xml xml=
[libvirt domain XML elided: the markup was stripped when this log was captured, leaving only text nodes interleaved with journald prefixes. Recoverable values: name instance-00000002, uuid 7f0ddc97-b43e-4a5a-8690-7583f53320f0, memory 196608 KiB, 1 vCPU, os type hvm, image container/disk format bare/qcow2, virtio RNG backed by /dev/urandom, sysinfo OpenStack Foundation / OpenStack Nova / 33.1.0, display name tempest-VolumesAssistedSnapshotsTest-server-367025004]
{{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Preparing to wait for
external event network-vif-plugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:31:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-367025004',display_name='tempest-VolumesAssistedSnapshotsTest-server-367025004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesassistedsnapshotstest-server-367025004',id=2,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO4w5hPBxKmd3vLt1iJD44reCQ0uqFrZUztfteqXGJNl11aaESOInYHsQrEQQB/y1JeJv2kdN3MUlgyyHmeLwK31PKytWfSeRKn35rXXESwYXLSIqlni7xjR7PTsUT6ptA==',key_name='tempest-keypair-1707075065',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ff7563f2fd6d456ca5a6feffe8a32082',ramdisk_id='',reservation_id='r-c5m0tai2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_
name='tempest-VolumesAssistedSnapshotsTest-937399013',owner_user_name='tempest-VolumesAssistedSnapshotsTest-937399013-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:32:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f3230ea9529545f48fe6fafdb05a882b',uuid=7f0ddc97-b43e-4a5a-8690-7583f53320f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Converting VIF {"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": 
"fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:f7:a7,bridge_name='br-int',has_traffic_filtering=True,id=ffa25775-a078-439b-bc5a-4aa8ef9b3b2b,network=Network(727cabe8-9053-4d06-9f7e-70dbb412fab9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffa25775-a0') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG os_vif [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:60:f7:a7,bridge_name='br-int',has_traffic_filtering=True,id=ffa25775-a078-439b-bc5a-4aa8ef9b3b2b,network=Network(727cabe8-9053-4d06-9f7e-70dbb412fab9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffa25775-a0') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a389d4b0-1509-58f0-981a-9597b211ae53', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:32:10 devstack 
nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffa25775-a0, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapffa25775-a0, col_values=(('qos', UUID('457e5a1b-5b55-494a-920c-890b8e5620c9')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapffa25775-a0, col_values=(('external_ids', {'iface-id': 'ffa25775-a078-439b-bc5a-4aa8ef9b3b2b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:f7:a7', 'vm-uuid': '7f0ddc97-b43e-4a5a-8690-7583f53320f0'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:32:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:10 devstack nova-compute[86443]: INFO os_vif [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:f7:a7,bridge_name='br-int',has_traffic_filtering=True,id=ffa25775-a078-439b-bc5a-4aa8ef9b3b2b,network=Network(727cabe8-9053-4d06-9f7e-70dbb412fab9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffa25775-a0') Mai 07 19:32:11 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:11 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Updated VIF entry in instance network info cache for port ffa25775-a078-439b-bc5a-4aa8ef9b3b2b. 
{{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:32:11 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Updating instance_info_cache with network_info: [{"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f6c278fd-c6a8-4976-972d-9625e4a8a279 req-f0165f85-3012-426b-8b54-41546128309e service nova] Releasing lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] No VIF found with MAC fa:16:3e:60:f7:a7, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-98c0892d-ee51-438a-9466-f20d45c6fb78 req-d2a8a8d6-75fc-4aed-bb4b-004a6a26adcb service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-vif-plugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-98c0892d-ee51-438a-9466-f20d45c6fb78 req-d2a8a8d6-75fc-4aed-bb4b-004a6a26adcb service nova] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-98c0892d-ee51-438a-9466-f20d45c6fb78 req-d2a8a8d6-75fc-4aed-bb4b-004a6a26adcb service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-98c0892d-ee51-438a-9466-f20d45c6fb78 req-d2a8a8d6-75fc-4aed-bb4b-004a6a26adcb service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-98c0892d-ee51-438a-9466-f20d45c6fb78 req-d2a8a8d6-75fc-4aed-bb4b-004a6a26adcb service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Processing event network-vif-plugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 
tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: ..., 7f0ddc97-b43e-4a5a-8690-7583f53320f0 => Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:13 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] VM Started (Lifecycle Event) Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:32:13 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Instance spawned successfully.
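[Editor's note] The entries above show Nova blocking on Neutron's network-vif-plugged notification before declaring the guest spawned (wait_for_instance_event, then "Instance event wait completed"). A minimal, self-contained sketch of that wait/dispatch pattern, using plain threading rather than Nova's actual InstanceEvents machinery (class and names here are illustrative only):

```python
import threading

class InstanceEvents:
    """Toy wait/dispatch registry (illustrative, not Nova's code)."""

    def __init__(self):
        self._waiters = {}             # event name -> threading.Event
        self._lock = threading.Lock()  # guards the registry, like the
                                       # "<uuid>-events" lock in the log

    def prepare(self, name):
        # Register interest in an external event *before* starting the
        # operation that triggers it (e.g. plugging the VIF).
        with self._lock:
            ev = threading.Event()
            self._waiters[name] = ev
            return ev

    def pop(self, name):
        # Called when the notification arrives. Returns False when nobody
        # is waiting, the case where Nova logs "No waiting events found".
        with self._lock:
            ev = self._waiters.pop(name, None)
        if ev is None:
            return False
        ev.set()
        return True

events = InstanceEvents()
waiter = events.prepare("network-vif-plugged-ffa25775")

# Simulate Neutron delivering the event shortly after the wait begins.
threading.Timer(0.01, events.pop, args=("network-vif-plugged-ffa25775",)).start()
assert waiter.wait(timeout=5)  # spawn continues once the event lands
```

Registering the waiter before the triggering operation is what closes the race: a notification that arrives "too early" still finds a waiter to wake.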
Mai 07 19:32:13 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:32:14 devstack nova-compute[86443]: INFO nova.compute.claims [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Claim successful on node devstack Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG 
nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:14 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] During sync_power_state the instance has a pending task (spawning). Skip. 
Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: ..., 7f0ddc97-b43e-4a5a-8690-7583f53320f0 => Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:14 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] VM Paused (Lifecycle Event) Mai 07 19:32:14 devstack nova-compute[86443]: INFO nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Took 12.10 seconds to spawn the instance on the hypervisor. Mai 07 19:32:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-85a7068a-4b63-47a0-b622-2dcf73697ad8 req-cb9e0f1c-ac9e-4802-8cf3-7c291a358f40 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-vif-plugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-85a7068a-4b63-47a0-b622-2dcf73697ad8 req-cb9e0f1c-ac9e-4802-8cf3-7c291a358f40 service nova] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG
oslo_concurrency.lockutils [req-85a7068a-4b63-47a0-b622-2dcf73697ad8 req-cb9e0f1c-ac9e-4802-8cf3-7c291a358f40 service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-85a7068a-4b63-47a0-b622-2dcf73697ad8 req-cb9e0f1c-ac9e-4802-8cf3-7c291a358f40 service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-85a7068a-4b63-47a0-b622-2dcf73697ad8 req-cb9e0f1c-ac9e-4802-8cf3-7c291a358f40 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] No waiting events found dispatching network-vif-plugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:32:15 devstack nova-compute[86443]: WARNING nova.compute.manager [req-85a7068a-4b63-47a0-b622-2dcf73697ad8 req-cb9e0f1c-ac9e-4802-8cf3-7c291a358f40 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received unexpected event network-vif-plugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b for instance with vm_state building and task_state spawning.
Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: ..., 7f0ddc97-b43e-4a5a-8690-7583f53320f0 => Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:15 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] VM Resumed (Lifecycle Event) Mai 07 19:32:15 devstack nova-compute[86443]: INFO nova.compute.manager [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Took 17.48 seconds to build instance.
Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:32:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18092fb0-a69f-467e-98d5-60436f27fb04 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" "released" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.313s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:32:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Allocating IP information in the background.
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:32:16 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:32:16 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:16 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:17 devstack nova-compute[86443]: DEBUG nova.policy [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '75fa165135a54f4594c98e4954f642ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b03cc5ee6fc644bb93377fc5016aca8b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:32:17 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Mai 07 19:32:17 devstack nova-compute[86443]: INFO nova.compute.manager [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] instance snapshotting Mai 07 19:32:17 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Beginning live snapshot process Mai 07 19:32:17 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Start building block device mappings for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Successfully created port: cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-changed-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f 
service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Refreshing instance network info cache due to event network-changed-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] Acquiring lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] Acquired lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Refreshing network info cache for port ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:32:18 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Creating image(s) Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "/opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "/opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827
tempest-DeleteServersTestJSON-416667827-project-member] Lock "/opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None
req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json -f qcow2 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.196s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=86443) inner
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json -f qcow2" returned: 0 in 0.219s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json -f qcow2 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
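[Editor's note] Several entries above run `qemu-img info --force-share --output=json` under oslo's prlimit wrapper and then consume the result. The JSON uses stable field names such as virtual-size, format, actual-size and backing-filename; a short sketch of parsing it (the sample values and paths below are invented, not taken from this log):

```python
import json

# Invented sample in the shape of `qemu-img info --output=json` output.
sample = """{
  "virtual-size": 1073741824,
  "filename": "/opt/stack/data/nova/instances/example/disk",
  "format": "qcow2",
  "actual-size": 2162688,
  "backing-filename": "/opt/stack/data/nova/instances/_base/example"
}"""

info = json.loads(sample)
# virtual-size is in bytes; 1073741824 bytes == 1 GiB
virtual_gib = info["virtual-size"] / 1024 ** 3
print(f'{info["format"]} image, {virtual_gib:.0f} GiB virtual, '
      f'backed by {info.get("backing-filename", "none")}')
```

The `--force-share` flag in the logged commands lets qemu-img read the image without taking a write lock, which is why Nova can inspect disks of running guests this way.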
Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.206s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Successfully updated port: cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-changed-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:19 devstack 
nova-compute[86443]: DEBUG nova.compute.manager [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Refreshing instance network info cache due to event network-changed-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] Acquiring lock "refresh_cache-a9e2b18d-247f-4143-831c-3f98b7be0ef8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] Acquired lock "refresh_cache-a9e2b18d-247f-4143-831c-3f98b7be0ef8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Refreshing network info cache for port cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk 1073741824" returned: 0 in 0.145s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.387s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json -f qcow2" returned: 0 in 0.290s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 
tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.183s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.285s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img 
create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmptsaz4z0c/967ba7aac4ab4796a99bf1508f172a8b.delta 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk --force-share --output=json" returned: 0 in 0.193s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Cannot resize image /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/disk to a smaller size. 
{{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Ensure instance console log exists: /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 
tempest-DeleteServersTestJSON-416667827-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "refresh_cache-a9e2b18d-247f-4143-831c-3f98b7be0ef8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmptsaz4z0c/967ba7aac4ab4796a99bf1508f172a8b.delta 1073741824" returned: 0 in 0.074s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:20 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Quiescing instance not available: QEMU guest agent is not enabled. Mai 07 19:32:20 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:20 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] COPY block job progress, current cursor: 0 final cursor: 76546048 {{(pid=86443) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:866}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Updated VIF entry in instance network info cache for port ffa25775-a078-439b-bc5a-4aa8ef9b3b2b. 
{{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-709fb139-6e11-489e-ba2a-a375652b1993 req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Updating instance_info_cache with network_info: [{"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.56", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:20 devstack 
nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] COPY block job progress, current cursor: 76546048 final cursor: 76546048 {{(pid=86443) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:866}} Mai 07 19:32:20 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Skipping quiescing instance: QEMU guest agent is not enabled. Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG nova.privsep.utils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=86443) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmptsaz4z0c/967ba7aac4ab4796a99bf1508f172a8b.delta /opt/stack/data/nova/instances/snapshots/tmptsaz4z0c/967ba7aac4ab4796a99bf1508f172a8b {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-709fb139-6e11-489e-ba2a-a375652b1993 
req-6d0e8ec5-9b15-4cf2-b3f2-5c537255ce6f service nova] Releasing lock "refresh_cache-7f0ddc97-b43e-4a5a-8690-7583f53320f0" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmptsaz4z0c/967ba7aac4ab4796a99bf1508f172a8b.delta /opt/stack/data/nova/instances/snapshots/tmptsaz4z0c/967ba7aac4ab4796a99bf1508f172a8b" returned: 0 in 0.340s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:21 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Snapshot extracted, beginning image upload Mai 07 19:32:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e865323f-631f-4c3f-9a0e-6ea76bf95858 req-67de1e82-77ce-4346-8128-0256bf18514a service nova] Releasing lock "refresh_cache-a9e2b18d-247f-4143-831c-3f98b7be0ef8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquired lock "refresh_cache-a9e2b18d-247f-4143-831c-3f98b7be0ef8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:21 devstack nova-compute[86443]: DEBUG nova.network.neutron [None 
req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:32:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:21 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:32:21 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Updating instance_info_cache with network_info: [{"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Releasing lock "refresh_cache-a9e2b18d-247f-4143-831c-3f98b7be0ef8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed 
tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Instance network_info: |[{"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Start _get_guest_xml network_info=[{"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:32:22 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed 
tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-DeleteServersTestJSON-server-76879314', uuid='a9e2b18d-247f-4143-831c-3f98b7be0ef8'), owner=OwnerMeta(userid='75fa165135a54f4594c98e4954f642ca', username='tempest-DeleteServersTestJSON-416667827-project-member', projectid='b03cc5ee6fc644bb93377fc5016aca8b', projectname='tempest-DeleteServersTestJSON-416667827'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", 
"datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175142.659523) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed 
tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-76879314',display_name='tempest-DeleteServersTestJSON-server-76879314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-deleteserverstestjson-server-76879314',id=3,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b03cc5ee6fc644bb93377fc5016aca8b',ramdisk_id='',reservation_id='r-bf9u8uzo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-416667827',owner_user_name='tempest-DeleteServersTestJSON-416667827-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:32:18Z,user_data=None,user_id='75fa165135a54
f4594c98e4954f642ca',uuid=a9e2b18d-247f-4143-831c-3f98b7be0ef8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Converting VIF {"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:68:04,bridge_name='br-int',has_traffic_filtering=True,id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56,network=Network(33148e37-2b9b-4544-b8ce-a1d9b2a933a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd2e1acb-ff') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:32:22 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lazy-loading 'pci_devices' on Instance uuid a9e2b18d-247f-4143-831c-3f98b7be0ef8 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] End _get_guest_xml xml= Mai 07 19:32:23 devstack 
nova-compute[86443]: [guest domain XML for instance a9e2b18d-247f-4143-831c-3f98b7be0ef8: the XML element markup was stripped during log capture and only interleaved text nodes survive. Recoverable values: domain name instance-00000003, memory 196608 KiB, 1 vCPU, os type hvm, Nova metadata (package version 33.1.0, server name tempest-DeleteServersTestJSON-server-76879314, creation time 2026-05-07 17:32:22, flavor 192 MB RAM / 1 GB disk / 0 swap / 0 ephemeral / 1 vCPU, image format bare / qcow2, owner user tempest-DeleteServersTestJSON-416667827-project-member of project tempest-DeleteServersTestJSON-416667827), sysinfo manufacturer OpenStack Foundation / product OpenStack Nova / version 33.1.0 / serial and uuid a9e2b18d-247f-4143-831c-3f98b7be0ef8 / family Virtual Machine, virtio RNG backed by /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Preparing to wait for external event network-vif-plugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827
tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.012s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-76879314',display_name='tempest-DeleteServersTestJSON-server-76879314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-deleteserverstestjson-server-76879314',id=3,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b03cc5ee6fc644bb93377fc5016aca8b',ramdisk_id='',reservation_id='r-bf9u8uzo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-416667827',owner_user_name='tempest-DeleteServersTestJSON-416667827-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:32:18Z,user_data=None,user_id='
75fa165135a54f4594c98e4954f642ca',uuid=a9e2b18d-247f-4143-831c-3f98b7be0ef8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Converting VIF {"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:68:04,bridge_name='br-int',has_traffic_filtering=True,id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56,network=Network(33148e37-2b9b-4544-b8ce-a1d9b2a933a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd2e1acb-ff') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG os_vif [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:68:04,bridge_name='br-int',has_traffic_filtering=True,id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56,network=Network(33148e37-2b9b-4544-b8ce-a1d9b2a933a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd2e1acb-ff') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c2ea0adc-b754-5586-ba61-02c43e1ba625', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd2e1acb-ff, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapcd2e1acb-ff, col_values=(('qos', UUID('f02ac31d-ceca-4a3b-aede-7ff55fe01574')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapcd2e1acb-ff, col_values=(('external_ids', {'iface-id': 'cd2e1acb-ff52-4bee-8bdb-556f1a50fd56', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:68:04', 'vm-uuid': 'a9e2b18d-247f-4143-831c-3f98b7be0ef8'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:32:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:23 devstack nova-compute[86443]: INFO os_vif [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:ec:68:04,bridge_name='br-int',has_traffic_filtering=True,id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56,network=Network(33148e37-2b9b-4544-b8ce-a1d9b2a933a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd2e1acb-ff') Mai 07 19:32:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] No VIF found with MAC fa:16:3e:ec:68:04, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:32:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:25 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Snapshot image upload complete Mai 07 19:32:25 devstack nova-compute[86443]: INFO nova.compute.manager [None req-01e96f19-38aa-43a1-807d-0aafa842c077 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: 
fbda5d5e-085d-499a-9b6c-5e8e388d5363] Took 7.81 seconds to snapshot the instance on the hypervisor. Mai 07 19:32:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a35b6cb9-019c-41e3-b1ef-2527fcb69db6 req-24f1251d-9881-400d-83f0-2a6fadfb483d service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-vif-plugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a35b6cb9-019c-41e3-b1ef-2527fcb69db6 req-24f1251d-9881-400d-83f0-2a6fadfb483d service nova] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a35b6cb9-019c-41e3-b1ef-2527fcb69db6 req-24f1251d-9881-400d-83f0-2a6fadfb483d service nova] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a35b6cb9-019c-41e3-b1ef-2527fcb69db6 req-24f1251d-9881-400d-83f0-2a6fadfb483d service nova] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a35b6cb9-019c-41e3-b1ef-2527fcb69db6 req-24f1251d-9881-400d-83f0-2a6fadfb483d service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Processing event 
network-vif-plugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: … => Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:26 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] VM Started (Lifecycle Event) Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 
19:32:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:32:26 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Instance spawned successfully. Mai 07 19:32:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] 
[instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed 
tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:32:27 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:27 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] VM Paused (Lifecycle Event) Mai 07 19:32:27 devstack nova-compute[86443]: INFO nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Took 8.67 seconds to spawn the instance on the hypervisor. 
Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ff2afe1f-8012-4ed9-86eb-7d4f4fa1378a req-7d6c15e6-03a0-4e2e-969d-a69f7479bd19 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-vif-plugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ff2afe1f-8012-4ed9-86eb-7d4f4fa1378a req-7d6c15e6-03a0-4e2e-969d-a69f7479bd19 service nova] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ff2afe1f-8012-4ed9-86eb-7d4f4fa1378a req-7d6c15e6-03a0-4e2e-969d-a69f7479bd19 service nova] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ff2afe1f-8012-4ed9-86eb-7d4f4fa1378a req-7d6c15e6-03a0-4e2e-969d-a69f7479bd19 service nova] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ff2afe1f-8012-4ed9-86eb-7d4f4fa1378a req-7d6c15e6-03a0-4e2e-969d-a69f7479bd19 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] No waiting events found dispatching network-vif-plugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:32:27 devstack nova-compute[86443]: WARNING nova.compute.manager [req-ff2afe1f-8012-4ed9-86eb-7d4f4fa1378a req-7d6c15e6-03a0-4e2e-969d-a69f7479bd19 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received unexpected event network-vif-plugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 for instance with vm_state active and task_state None. Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Checking 
state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:28 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] VM Resumed (Lifecycle Event) Mai 07 19:32:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Took 14.11 seconds to build instance. Mai 07 19:32:28 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:28 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'flavor' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, 
VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:32:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.653s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 1.522s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:30 devstack nova-compute[86443]: INFO nova.compute.manager [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Attaching volume 605a4e7c-0cad-4174-8367-46d132450223 to /dev/vdb Mai 07 19:32:30 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:32:30 devstack nova-compute[86443]: INFO oslo.privsep.daemon [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpn9npor2h/privsep.sock'] Mai 07 19:32:30 devstack sudo[88050]: quobyte : PWD=/ ; USER=root ; COMMAND=/opt/stack/data/venv/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context os_brick.privileged.default --privsep_sock_path /tmp/tmpn9npor2h/privsep.sock Mai 07 19:32:30 devstack sudo[88050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000) Mai 07 19:32:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lazy-loading 'flavor' on Instance uuid a9e2b18d-247f-4143-831c-3f98b7be0ef8 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:32 devstack sudo[88050]: pam_unix(sudo:session): session closed for user root Mai 07 19:32:32 devstack nova-compute[86443]: INFO oslo.privsep.daemon [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Spawned new privsep daemon via rootwrap Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Failed to 
query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:32:32 devstack nova-compute[86443]: WARNING os_brick.initiator.connectors.nvmeof [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Could not find nvme_core/parameters/multipath: FileNotFoundError: [Errno 2] No such file or directory: '/sys/module/nvme_core/parameters/multipath' Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:32:32 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:feca:f2d2', 'fe80::6466:37ff:fed8:343c', 'fe80::fc16:3eff:fe60:f7a7', 'fe80::38fa:a9ff:fe31:86b8', 'fe80::fc16:3eff:feec:6804', 'fe80::d47b:30ff:fe51:ed5b'] Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] <== get_connector_properties: return (1807ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:32:32 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating existing volume attachment record: e56c1cdf-90f4-4dd0-bfc8-60b44d58691e {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 1.522s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: held 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e 
tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): systemctl is-system-running {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "systemctl is-system-running" returned: 0 in 0.039s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] systemd detected. 
{{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:32:33 devstack nova-compute[86443]: WARNING oslo_config.cfg [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Deprecated: Option "quobyte_mount_point_base" from group "libvirt" is deprecated for removal ( Mai 07 19:32:33 devstack nova-compute[86443]: Quobyte volume driver in cinder was marked unsupported. Quobyte volume support Mai 07 19:32:33 devstack nova-compute[86443]: will be removed from nova when the volume driver is removed from cinder. Mai 07 19:32:33 devstack nova-compute[86443]: ). Its value may be silently ignored in the future. Mai 07 19:32:33 devstack nova-compute[86443]: WARNING oslo_config.cfg [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Deprecated: Option "quobyte_client_cfg" from group "libvirt" is deprecated for removal ( Mai 07 19:32:33 devstack nova-compute[86443]: Quobyte volume driver in cinder was marked unsupported. Quobyte volume support Mai 07 19:32:33 devstack nova-compute[86443]: will be removed from nova when the volume driver is removed from cinder. Mai 07 19:32:33 devstack nova-compute[86443]: ). Its value may be silently ignored in the future. 
Mai 07 19:32:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Mounting volume osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa at mount point /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 via systemd-run {{(pid=86443) mount_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:79}} Mai 07 19:32:34 devstack nova-compute[86443]: INFO nova.virt.libvirt.volume.quobyte [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Mounted volume: osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa Mai 07 19:32:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.568s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:34 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'flavor' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] attach device xml: Mai 07 19:32:34 devstack nova-compute[86443]: Mai 
07 19:32:34 devstack nova-compute[86443]: [multi-line <disk> attach-device XML payload stripped during log capture; only the volume serial line survived] Mai 07 19:32:34 devstack nova-compute[86443]: 605a4e7c-0cad-4174-8367-46d132450223 Mai 07 19:32:34 devstack nova-compute[86443]: {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:32:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:34 devstack nova-compute[86443]: INFO nova.compute.manager [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Attaching volume e5dd674b-fe9e-4649-b8b8-3ea9a41ede96 to /dev/vdb Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False,
'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:32:35 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:feca:f2d2', 'fe80::6466:37ff:fed8:343c', 'fe80::fc16:3eff:fe60:f7a7', 'fe80::38fa:a9ff:fe31:86b8', 'fe80::fc16:3eff:feec:6804', 'fe80::d47b:30ff:fe51:ed5b'] Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] <== get_connector_properties: return (124ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:32:35 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Updating existing volume attachment record: fdac0afb-f0ed-4b4f-bf1e-439d284a6106 {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "connect_qb_volume" by 
"nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] systemd detected. {{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.009s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lazy-loading 'flavor' on Instance uuid a9e2b18d-247f-4143-831c-3f98b7be0ef8 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No BDM found with device name vdb, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No VIF found with MAC fa:16:3e:ca:f2:d2, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:32:36 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] attach device xml: [multi-line device XML stripped in capture; contains serial e5dd674b-fe9e-4649-b8b8-3ea9a41ede96] {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:32:38 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-867e3fa7-facc-4dda-be17-2bfbb7e1812e tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 7.624s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:38 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:38 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] No BDM found with device name vdb, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:38 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] No VIF found with MAC fa:16:3e:ec:68:04, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:32:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:39 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lazy-loading 'flavor' on Instance uuid 7f0ddc97-b43e-4a5a-8690-7583f53320f0 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:39 devstack nova-compute[86443]: INFO nova.compute.manager [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 
tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Rescuing Mai 07 19:32:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquired lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:39 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:32:40 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-d3be192a-11cd-4178-b7d6-29095471f26b tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 5.271s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 1.527s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:40 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:40 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:41 devstack nova-compute[86443]: INFO nova.compute.manager [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Terminating instance Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Releasing lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:41 devstack nova-compute[86443]: INFO 
nova.compute.manager [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Attaching volume f85ecce2-9e09-4c47-ba66-26a684a5cb5f to /dev/vdb Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Start destroying the instance on the hypervisor. 
{{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] 'nvme version' failed. Not Retrying. {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:32:41 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', 
'2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:feca:f2d2', 'fe80::6466:37ff:fed8:343c', 'fe80::fc16:3eff:fe60:f7a7', 'fe80::38fa:a9ff:fe31:86b8', 'fe80::d47b:30ff:fe51:ed5b'] Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] LIGHTOS: did not find dsc, continuing anyway. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] <== get_connector_properties: return (101ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] [instance: 
7f0ddc97-b43e-4a5a-8690-7583f53320f0] Updating existing volume attachment record: 1d1dadc8-56c6-47f8-a028-37c4dd130d60 {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.993657011000096 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.99301000100013 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.992177213000105 future=) {{(pid=86443) _log 
/opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:41 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Instance destroyed successfully. Mai 07 19:32:41 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lazy-loading 'resources' on Instance uuid a9e2b18d-247f-4143-831c-3f98b7be0ef8 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-7a30f241-d9cf-48a2-82bb-33178dbf65c7 req-461a5fe9-21d1-4475-bf4c-2052c4c1820d service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-vif-unplugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-7a30f241-d9cf-48a2-82bb-33178dbf65c7 req-461a5fe9-21d1-4475-bf4c-2052c4c1820d service nova] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-7a30f241-d9cf-48a2-82bb-33178dbf65c7 req-461a5fe9-21d1-4475-bf4c-2052c4c1820d service nova] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-7a30f241-d9cf-48a2-82bb-33178dbf65c7 req-461a5fe9-21d1-4475-bf4c-2052c4c1820d service nova] Lock 
"a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-7a30f241-d9cf-48a2-82bb-33178dbf65c7 req-461a5fe9-21d1-4475-bf4c-2052c4c1820d service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] No waiting events found dispatching network-vif-unplugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-7a30f241-d9cf-48a2-82bb-33178dbf65c7 req-461a5fe9-21d1-4475-bf4c-2052c4c1820d service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-vif-unplugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-76879314',display_name='tempest-DeleteServersTestJSON-server-76879314',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-deleteserverstestjson-server-76879314',id=3,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-05-07T17:32:27Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b03cc5ee6fc644bb93377fc5016aca8b',ramdisk_id='',reservation_id='r-bf9u8uzo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-416667827',owner_user_name='tempest-DeleteServersTestJSON-416667827-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-05-07T17:32:28Z,user_data=None,user_id='75fa165135a54f4594c98e4954f642ca',uuid=a9e2b18d-247f-4143-831c-3f98b7be0ef8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Converting VIF {"id": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "address": "fa:16:3e:ec:68:04", "network": {"id": "33148e37-2b9b-4544-b8ce-a1d9b2a933a3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1492721874-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b03cc5ee6fc644bb93377fc5016aca8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd2e1acb-ff", "ovs_interfaceid": "cd2e1acb-ff52-4bee-8bdb-556f1a50fd56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:68:04,bridge_name='br-int',has_traffic_filtering=True,id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56,network=Network(33148e37-2b9b-4544-b8ce-a1d9b2a933a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd2e1acb-ff') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG os_vif [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:68:04,bridge_name='br-int',has_traffic_filtering=True,id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56,network=Network(33148e37-2b9b-4544-b8ce-a1d9b2a933a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd2e1acb-ff') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:32:42 devstack 
nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd2e1acb-ff, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f02ac31d-ceca-4a3b-aede-7ff55fe01574) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:32:42 devstack nova-compute[86443]: INFO os_vif [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 
tempest-DeleteServersTestJSON-416667827-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:68:04,bridge_name='br-int',has_traffic_filtering=True,id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56,network=Network(33148e37-2b9b-4544-b8ce-a1d9b2a933a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd2e1acb-ff') Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Couldn't unmount the Quobyte Volume at /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command. 
Mai 07 19:32:42 devstack nova-compute[86443]: Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 Mai 07 19:32:42 devstack nova-compute[86443]: Exit code: 32 Mai 07 19:32:42 devstack nova-compute[86443]: Stdout: '' Mai 07 19:32:42 devstack nova-compute[86443]: Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n' Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Traceback (most recent call last): Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte File "/opt/stack/nova/nova/virt/libvirt/volume/quobyte.py", line 96, in umount_volume Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte nova.privsep.libvirt.umount(mnt_base) Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 315, in _wrap Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte return self.channel.remote_call(name, args, kwargs, r_call_timeout) Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 262, in remote_call Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte raise exc_type(*result[2]) Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command. 
Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Exit code: 32 Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stdout: '' Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n' Mai 07 19:32:42 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.025s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:42 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Deleting instance files /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8_del Mai 07 19:32:42 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Deletion of /opt/stack/data/nova/instances/a9e2b18d-247f-4143-831c-3f98b7be0ef8_del complete Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task 
ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 
19:32:43 devstack nova-compute[86443]: INFO nova.compute.manager [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Took 1.45 seconds to destroy the instance on the hypervisor. Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:32:43 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999416126999904 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:43 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance destroyed successfully. Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'numa_topology' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:43 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:32:43 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Attempting a stable device rescue Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] rescue generated disk_info: {'disk_bus': 
'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}}} {{(pid=86443) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4650}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance directory exists: not creating {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5211}} Mai 07 19:32:43 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Creating image(s) Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.info" acquired by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:43 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'trusted_certs' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-vif-unplugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Acquiring lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] No waiting events found dispatching network-vif-unplugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-vif-unplugged-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching 
network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:32:44 devstack nova-compute[86443]: WARNING nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state active and task_state rescuing. Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:32:44 devstack nova-compute[86443]: WARNING nova.compute.manager [req-99c17397-b2fa-495c-b559-7249ba8494b9 req-8ce0be07-2592-4153-9363-332f6e543fd4 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state active and task_state rescuing. Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Object Instance lazy-loaded attributes: numa_topology,trusted_certs {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "0892e37d6bf1e03ed281439514dc02557271a960" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 
tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "0892e37d6bf1e03ed281439514dc02557271a960" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-203eed9e-1120-4ef1-bb56-56200a0771f3 req-ec52ba13-a04e-4f6f-a13a-648e6abcec08 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Received event network-vif-deleted-cd2e1acb-ff52-4bee-8bdb-556f1a50fd56 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:44 devstack nova-compute[86443]: INFO nova.compute.manager [req-203eed9e-1120-4ef1-bb56-56200a0771f3 req-ec52ba13-a04e-4f6f-a13a-648e6abcec08 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Neutron deleted interface cd2e1acb-ff52-4bee-8bdb-556f1a50fd56; detaching it from the instance and deleting it from the info cache Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-203eed9e-1120-4ef1-bb56-56200a0771f3 req-ec52ba13-a04e-4f6f-a13a-648e6abcec08 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk --force-share --output=json" returned: 0 in 0.174s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:44 devstack nova-compute[86443]: DEBUG 
nova.network.neutron [-] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-203eed9e-1120-4ef1-bb56-56200a0771f3 req-ec52ba13-a04e-4f6f-a13a-648e6abcec08 service nova] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Detach interface failed, port_id=cd2e1acb-ff52-4bee-8bdb-556f1a50fd56, reason: Instance a9e2b18d-247f-4143-831c-3f98b7be0ef8 could not be found. {{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] skipping disk /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/volume-605a4e7c-0cad-4174-8367-46d132450223 (vdb) as it is a volume {{(pid=86443) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:12328}} Mai 07 19:32:45 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.036s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=5306MB free_disk=14.804450988769531GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": 
null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", 
"address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, 
"label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:45 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Took 2.36 seconds to deallocate network for instance. 
Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.part --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.part --force-share --output=json" returned: 0 in 0.110s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG nova.virt.images [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] 10f5aa2a-7a9d-4b37-b8a9-be1d4bcb55c2 was qcow2, converting to raw {{(pid=86443) fetch_to_raw /opt/stack/nova/nova/virt/images.py:278}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG nova.privsep.utils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=86443) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Mai 07 19:32:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.part /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.converted {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:46 devstack nova-compute[86443]: INFO nova.compute.manager [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Took 0.64 seconds to detach 1 volumes for instance. 
Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.part /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.converted" returned: 0 in 0.282s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.converted --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 7f0ddc97-b43e-4a5a-8690-7583f53320f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance a9e2b18d-247f-4143-831c-3f98b7be0ef8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 3 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=1088MB phys_disk=25GB used_disk=3GB total_vcpus=4 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:32:45 up 32 min, 1 user, load average: 5.23, 3.29, 2.21\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_rescuing': '1', 'num_os_type_None': '3', 'num_proj_573feae7be914c46a57ac9575b2f8e5f': '1', 'io_workload': '1', 'num_task_None': '1', 'num_proj_ff7563f2fd6d456ca5a6feffe8a32082': '1', 'num_task_deleting': '1', 'num_proj_b03cc5ee6fc644bb93377fc5016aca8b': '1'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960.converted --force-share --output=json" returned: 0 in 0.130s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "0892e37d6bf1e03ed281439514dc02557271a960" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.149s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock 
"0892e37d6bf1e03ed281439514dc02557271a960" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "0892e37d6bf1e03ed281439514dc02557271a960" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): 
/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960 --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960 --force-share --output=json" returned: 0 in 0.116s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960,backing_fmt=raw 
/opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.rescue {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/0892e37d6bf1e03ed281439514dc02557271a960,backing_fmt=raw /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363/disk.rescue" returned: 0 in 0.047s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "0892e37d6bf1e03ed281439514dc02557271a960" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.175s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'migration_context' on 
Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:46 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Object Instance lazy-loaded attributes: numa_topology,trusted_certs,migration_context {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Start _get_guest_xml network_info=[{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": 
"2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "vif_mac": "fa:16:3e:ca:f2:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue={'image_id': '10f5aa2a-7a9d-4b37-b8a9-be1d4bcb55c2', 'kernel_id': '', 
'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': 'e56c1cdf-90f4-4dd0-bfc8-60b44d58691e', 'disk_bus': 'virtio', 'connection_info': {'driver_volume_type': 'quobyte', 'data': {'export': 'osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa', 'name': 'volume-605a4e7c-0cad-4174-8367-46d132450223', 'options': None, 'format': 'qcow2', 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'mount_point_base': '/opt/stack/data/cinder/mnt', 'enforce_multipath': True}, 'status': 'reserved', 'instance': 'fbda5d5e-085d-499a-9b6c-5e8e388d5363', 'attached_at': '', 'detached_at': '', 'volume_id': '605a4e7c-0cad-4174-8367-46d132450223', 'serial': '605a4e7c-0cad-4174-8367-46d132450223'}, 'guest_format': None, 'mount_device': '/dev/vdb', 'boot_index': None, 'device_type': 'disk', 'delete_on_termination': False, 'volume_type': None}], 'swap': None}share_info=ShareMappingList(objects=[]) {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'resources' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.286s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 60.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.906s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 
tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Object Instance lazy-loaded attributes: numa_topology,trusted_certs,migration_context,resources {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:32:47 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-ServerStableDeviceRescueTest-server-1930754072', uuid='fbda5d5e-085d-499a-9b6c-5e8e388d5363'), owner=OwnerMeta(userid='47c66181a1ab44acb74977f56e0f3ca3', username='tempest-ServerStableDeviceRescueTest-2036085738-project-member', projectid='573feae7be914c46a57ac9575b2f8e5f', projectname='tempest-ServerStableDeviceRescueTest-2036085738'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'ide', 'hw_disk_bus': 'virtio', 'hw_machine_type': 'pc', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "vif_mac": "fa:16:3e:ca:f2:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175167.633787) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CPU controller found on host. {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:32:47 devstack nova-compute[86443]: 
DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), 
maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:32:47 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'vcpu_model' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Object Instance lazy-loaded attributes: numa_topology,trusted_certs,migration_context,resources,vcpu_model {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "connect_qb_volume" acquired by 
"nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] systemd detected. {{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.006s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.147s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:31:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1930754072',display_name='tempest-ServerStableDeviceRescueTest-server-1930754072',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serverstabledevicerescuetest-server-1930754072',id=1,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0/3BWn/zzrBQnAkXdd1s2YFde8yarhfzAlXsa7tLa/iX4wVP25675chgq0wVKE1yt1vGqpfarn8QigLrDIoYheGujk2mCEoyKPhrMw8JZOvM12TXRkJHFDuOg8jZSc3g==',key_name='tempest-keypair-1700510776',keypairs=,launch_index=0,launched_at=2026-05-07T17:31:46Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='573feae7be914c46a57ac9575b2f8e5f',ramdisk_id='',reservation_id='r-lwlutd2e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.open
stack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-2036085738',owner_user_name='tempest-ServerStableDeviceRescueTest-2036085738-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:32:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='47c66181a1ab44acb74977f56e0f3ca3',uuid=fbda5d5e-085d-499a-9b6c-5e8e388d5363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "vif_mac": "fa:16:3e:ca:f2:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:32:48 devstack 
nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converting VIF {"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "vif_mac": "fa:16:3e:ca:f2:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converted object 
VIFOpenVSwitch(active=True,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:32:48 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'pci_devices' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:48 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Deleted allocations for instance a9e2b18d-247f-4143-831c-3f98b7be0ef8 Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Object Instance lazy-loaded attributes: numa_topology,trusted_certs,migration_context,resources,vcpu_model,pci_devices {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] End _get_guest_xml xml= [multi-line libvirt domain XML elided: angle-bracket markup was stripped during log capture; recoverable fields include domain name instance-00000001, uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363, memory 196608, 1 vcpu, instance name tempest-ServerStableDeviceRescueTest-server-1930754072, sysinfo OpenStack Foundation / OpenStack Nova 33.1.0, os type hvm, image metadata bare/qcow2 with ide cdrom bus and virtio disk/rng/video/vif models on machine type pc, owner tempest-ServerStableDeviceRescueTest-2036085738-project-member / tempest-ServerStableDeviceRescueTest-2036085738, volume serial 605a4e7c-0cad-4174-8367-46d132450223, and RNG backend /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:32:49 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance destroyed successfully.
Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] systemd detected. 
{{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lazy-loading 'flavor' on Instance uuid 7f0ddc97-b43e-4a5a-8690-7583f53320f0 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e64165ca-7753-4bd6-a5cb-cd9b2868ba95 tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Lock "a9e2b18d-247f-4143-831c-3f98b7be0ef8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.674s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] attach device xml: [device XML elided: angle-bracket markup was stripped during log capture; recoverable fields include serial f85ecce2-9e09-4c47-ba66-26a684a5cb5f]
19:32:49 devstack nova-compute[86443]: {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:32:50 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:50 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No BDM found with device name vdb, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:50 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No BDM found with device name vdc, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}}
Mai 07 19:32:50 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] No VIF found with MAC fa:16:3e:ca:f2:d2, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}}
Mai 07 19:32:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:32:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-f3b60f95-c0eb-4421-9fa7-55cb5f8dfdc2 req-27cdb1b5-fda2-4e83-8226-cd211c6b5a7e service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f3b60f95-c0eb-4421-9fa7-55cb5f8dfdc2 req-27cdb1b5-fda2-4e83-8226-cd211c6b5a7e service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f3b60f95-c0eb-4421-9fa7-55cb5f8dfdc2 req-27cdb1b5-fda2-4e83-8226-cd211c6b5a7e service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f3b60f95-c0eb-4421-9fa7-55cb5f8dfdc2 req-27cdb1b5-fda2-4e83-8226-cd211c6b5a7e service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-f3b60f95-c0eb-4421-9fa7-55cb5f8dfdc2 req-27cdb1b5-fda2-4e83-8226-cd211c6b5a7e service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}}
Mai 07 19:32:51 devstack nova-compute[86443]: WARNING nova.compute.manager [req-f3b60f95-c0eb-4421-9fa7-55cb5f8dfdc2 req-27cdb1b5-fda2-4e83-8226-cd211c6b5a7e service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state active and task_state rescuing.
Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}}
Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] No BDM found with device name vdb, not building metadata.
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] No VIF found with MAC fa:16:3e:60:f7:a7, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:52 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Removed pending STOPPED event for fbda5d5e-085d-499a-9b6c-5e8e388d5363 due to new event Resumed> {{(pid=86443) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:520}} Mai 07 19:32:52 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event 
/opt/stack/nova/nova/virt/driver.py:1874}}
Mai 07 19:32:52 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Resumed (Lifecycle Event)
Mai 07 19:32:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-0a67b724-a897-46f3-932c-28f514086b5b tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-18e6ae30-fe5f-4b78-9b7b-6dfadcd188ec tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 11.860s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-439a7e1a-af06-49f1-908a-0f09f4b4eace req-39e2a869-2365-4232-a00d-f8f62c0f65ad service nova] [instance:
fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-439a7e1a-af06-49f1-908a-0f09f4b4eace req-39e2a869-2365-4232-a00d-f8f62c0f65ad service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-439a7e1a-af06-49f1-908a-0f09f4b4eace req-39e2a869-2365-4232-a00d-f8f62c0f65ad service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-439a7e1a-af06-49f1-908a-0f09f4b4eace req-39e2a869-2365-4232-a00d-f8f62c0f65ad service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-439a7e1a-af06-49f1-908a-0f09f4b4eace req-39e2a869-2365-4232-a00d-f8f62c0f65ad service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}}
Mai 07 19:32:53 devstack nova-compute[86443]: WARNING nova.compute.manager [req-439a7e1a-af06-49f1-908a-0f09f4b4eace
req-39e2a869-2365-4232-a00d-f8f62c0f65ad service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state rescued and task_state unrescuing. Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:53 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Started (Lifecycle Event) Mai 07 19:32:53 devstack nova-compute[86443]: INFO nova.compute.manager [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Unrescuing Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquired lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:53 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 
tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}}
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}}
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] volume_snapshot_create: create_info: {'snapshot_id': '88cb0faa-1bbb-4a05-b62b-0d79d8dfc9d2', 'type': 'qcow2', 'new_file': 'new_file'} {{(pid=86443) volume_snapshot_create /opt/stack/nova/nova/virt/libvirt/driver.py:3769}}
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] snap xml:
Mai 07 19:32:54 devstack nova-compute[86443]: [snapshot XML body lost to markup stripping during capture] {{(pid=86443) _volume_snapshot_create /opt/stack/nova/nova/virt/libvirt/driver.py:3710}}
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] Lazy-loading 'system_metadata' on Instance uuid 7f0ddc97-b43e-4a5a-8690-7583f53320f0 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
Mai 07 19:32:54 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-b59e4966-45b5-4e81-a294-1ad0661b6621 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] volume_snapshot_delete: delete_info: {'volume_id': 'f85ecce2-9e09-4c47-ba66-26a684a5cb5f'} {{(pid=86443) _volume_snapshot_delete /opt/stack/nova/nova/virt/libvirt/driver.py:3877}}
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [None req-b59e4966-45b5-4e81-a294-1ad0661b6621 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Error occurred during volume_snapshot_delete, sending error status to Cinder.: KeyError: 'type'
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Traceback (most recent call last):
Mai 07 19:32:54 devstack
nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 4050, in volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] self._volume_snapshot_delete(context, instance, volume_id, Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3880, in _volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] if delete_info['type'] != 'qcow2': Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] ~~~~~~~~~~~^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] KeyError: 'type' Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Mai 07 19:32:54 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] During sync_power_state the instance has a pending task (unrescuing). Skip. Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [None req-b59e4966-45b5-4e81-a294-1ad0661b6621 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] Failed to send updated snapshot status to volume service.: nova.exception.SnapshotNotFound: Snapshot None could not be found. 
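[Editor's note] The KeyError above can be pinned down from the two DEBUG lines that precede it: volume_snapshot_delete was handed a delete_info containing only volume_id, while driver.py:3880 indexes delete_info['type'] unconditionally. A minimal standalone sketch of the failing lookup (plain Python, not Nova source; the guarded variant is a hypothetical illustration, not Nova's actual fix):

```python
# delete_info as logged in the DEBUG line above: only 'volume_id' present.

def delete_check_unguarded(delete_info):
    # Mirrors the failing pattern at driver.py:3880 -- raises KeyError
    # when the 'type' key is absent from the dict.
    if delete_info['type'] != 'qcow2':
        raise ValueError('Unsupported snapshot delete format')
    return True

def delete_check_guarded(delete_info):
    # Hypothetical defensive variant: validate the input before branching,
    # so a malformed request fails with a descriptive error.
    snap_type = delete_info.get('type')
    if snap_type is None:
        raise ValueError("delete_info missing required key 'type'")
    if snap_type != 'qcow2':
        raise ValueError('Unsupported snapshot delete format: %s' % snap_type)
    return True

info = {'volume_id': 'f85ecce2-9e09-4c47-ba66-26a684a5cb5f'}
try:
    delete_check_unguarded(info)
except KeyError as exc:
    print('KeyError:', exc)   # KeyError: 'type'
```

As the log shows, the unguarded lookup raises before any validation error can be reported back, so Cinder only sees the generic error status.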
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Traceback (most recent call last): Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 4050, in volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver self._volume_snapshot_delete(context, instance, volume_id, Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3880, in _volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver if delete_info['type'] != 'qcow2': Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ~~~~~~~~~~~^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver KeyError: 'type' Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred: Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Traceback (most recent call last): Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 445, in wrapper Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 740, in update_snapshot_status Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status( Mai 07 19:32:54 devstack 
nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status', Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 223, in post Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self._cs_request(url, 'POST', **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 211, in _cs_request Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 197, in request Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body) Mai 07 
19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver cinderclient.exceptions.NotFound: Snapshot None could not be found. (HTTP 404) (Request-ID: req-d5bd6f9f-5fcc-42ed-b360-d85cdf4dcb10) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred: Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Traceback (most recent call last): Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3609, in _volume_snapshot_update_status Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver self._volume_api.update_snapshot_status(context, Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 376, in wrapper Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver res = method(self, ctx, *args, **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 447, in wrapper Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver _reraise(exception.SnapshotNotFound(snapshot_id=snapshot_id)) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 467, in _reraise Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver raise desired_exc.with_traceback(sys.exc_info()[2]) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 445, in wrapper Mai 07 
19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 740, in update_snapshot_status Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status( Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status', Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 223, in post Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self._cs_request(url, 'POST', **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 211, in _cs_request 
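[Editor's note] The chained traceback here shows Nova's cinder.py wrapper translating a cinderclient NotFound (HTTP 404) into nova.exception.SnapshotNotFound while preserving the low-level traceback: the `_reraise` frame calls `desired_exc.with_traceback(sys.exc_info()[2])`. A standalone sketch of that exception-translation pattern, using stand-in exception classes rather than the real Nova/cinderclient ones:

```python
import sys

class ClientNotFound(Exception):
    """Stands in for cinderclient.exceptions.NotFound."""

class SnapshotNotFound(Exception):
    """Stands in for nova.exception.SnapshotNotFound."""
    def __init__(self, snapshot_id):
        super().__init__('Snapshot %s could not be found.' % snapshot_id)

def update_snapshot_status(snapshot_id):
    # Simulated client call failing server-side with a 404, as in the log.
    raise ClientNotFound('HTTP 404')

def wrapper(snapshot_id):
    try:
        return update_snapshot_status(snapshot_id)
    except ClientNotFound:
        # Re-raise as the service-level exception while keeping the original
        # traceback attached, the way cinder.py's _reraise helper does.
        raise SnapshotNotFound(snapshot_id).with_traceback(sys.exc_info()[2])

try:
    wrapper(None)
except SnapshotNotFound as exc:
    print(exc)   # Snapshot None could not be found.
```

Because the original exception is still active when the new one is raised, Python records it as `__context__`, which is what produces the "During handling of the above exception, another exception occurred" sections in this log. Note also that the snapshot_id was None here, so the translated message names no snapshot at all.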
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 197, in request Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver nova.exception.SnapshotNotFound: Snapshot None could not be found. Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server [None req-b59e4966-45b5-4e81-a294-1ad0661b6621 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] Exception during message handling: KeyError: 'type' Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 269, in inner Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server return func(*args, **kwargs) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 271, in __exit__ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self.force_reraise() Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 233, in force_reraise Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server raise self.value Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Mai 07 19:32:54 devstack nova-compute[86443]: ERROR 
oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4829, in volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self.driver.volume_snapshot_delete(context, instance, volume_id, Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 4053, in volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 271, in __exit__ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self.force_reraise() Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 233, in force_reraise Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server raise self.value Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 4050, in volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self._volume_snapshot_delete(context, instance, volume_id, Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3880, in _volume_snapshot_delete Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server if delete_info['type'] != 'qcow2': Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server ~~~~~~~~~~~^^^^^^^^ Mai 07 19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server KeyError: 'type' Mai 07 
19:32:54 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.47 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}}
Mai 07 19:32:54 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Skipping quiescing instance: QEMU guest agent is not enabled.
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Unable to create VM snapshot, failing volume_snapshot operation.: libvirt.libvirtError: missing existing file for disk vdb: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/new_file: No such file or directory
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Traceback (most recent call last):
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3737, in _volume_snapshot_create
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] guest.snapshot(snapshot, no_metadata=True, disk_only=True,
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 583, in
snapshot
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] self._domain.snapshotCreateXML(device_xml, flags=flags)
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/usr/lib/python3/dist-packages/libvirt.py", line 3125, in snapshotCreateXML
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] raise libvirtError('virDomainSnapshotCreateXML() failed')
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] libvirt.libvirtError: missing existing file for disk vdb: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/new_file: No such file or directory
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0]
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Error occurred during volume_snapshot_create, sending error status to Cinder.: libvirt.libvirtError: missing existing file for disk vdb: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/new_file: No such file or directory
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Traceback (most recent call last):
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3787, in volume_snapshot_create
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0]
self._volume_snapshot_create(context, instance, guest,
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3737, in _volume_snapshot_create
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] guest.snapshot(snapshot, no_metadata=True, disk_only=True,
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 583, in snapshot
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] self._domain.snapshotCreateXML(device_xml, flags=flags)
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] File "/usr/lib/python3/dist-packages/libvirt.py", line 3125, in snapshotCreateXML
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] raise libvirtError('virDomainSnapshotCreateXML() failed')
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] libvirt.libvirtError: missing existing file for disk vdb: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/new_file: No such file or directory
Mai 07 19:32:54 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0]
Mai 07 19:32:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2
tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] Failed to send updated snapshot status to volume service.: nova.exception.SnapshotNotFound: Snapshot 88cb0faa-1bbb-4a05-b62b-0d79d8dfc9d2 could not be found.
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3787, in volume_snapshot_create
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver self._volume_snapshot_create(context, instance, guest,
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3737, in _volume_snapshot_create
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver guest.snapshot(snapshot, no_metadata=True, disk_only=True,
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 583, in snapshot
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver self._domain.snapshotCreateXML(device_xml, flags=flags)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/usr/lib/python3/dist-packages/libvirt.py", line 3125, in snapshotCreateXML
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver raise libvirtError('virDomainSnapshotCreateXML() failed')
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver libvirt.libvirtError: missing existing file for disk vdb: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/new_file: No such file or directory
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception
occurred:
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 445, in wrapper
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 740, in update_snapshot_status
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status(
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status',
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 223, in post
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver
return self._cs_request(url, 'POST', **kwargs)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 211, in _cs_request
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 197, in request
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver cinderclient.exceptions.NotFound: Snapshot 88cb0faa-1bbb-4a05-b62b-0d79d8dfc9d2 could not be found.
(HTTP 404) (Request-ID: req-a30051e1-8789-40b1-aa5a-3a28094b65f2)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3609, in _volume_snapshot_update_status
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver self._volume_api.update_snapshot_status(context,
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 376, in wrapper
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver res = method(self, ctx, *args, **kwargs)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 447, in wrapper
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver _reraise(exception.SnapshotNotFound(snapshot_id=snapshot_id))
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 467, in _reraise
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver raise desired_exc.with_traceback(sys.exc_info()[2])
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 445, in wrapper
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs)
Mai 07 19:32:55
devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 740, in update_snapshot_status
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status(
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status',
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 223, in post
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self._cs_request(url, 'POST', **kwargs)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 211, in _cs_request
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs)
Mai 07 19:32:55
devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.12/site-packages/cinderclient/client.py", line 197, in request
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver nova.exception.SnapshotNotFound: Snapshot 88cb0faa-1bbb-4a05-b62b-0d79d8dfc9d2 could not be found.
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR nova.virt.libvirt.driver
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server [None req-e8e69ee0-7e88-4b36-9acb-3b7d1d4560b2 tempest-VolumesAssistedSnapshotsTest-1309444160 tempest-VolumesAssistedSnapshotsTest-1309444160-project] Exception during message handling: libvirt.libvirtError: missing existing file for disk vdb: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/new_file: No such file or directory
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR
oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 269, in inner
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server return func(*args, **kwargs)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 271, in __exit__
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self.force_reraise()
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 233, in force_reraise
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server raise self.value
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
Mai 07 19:32:55 devstack
nova-compute[86443]: ERROR oslo_messaging.rpc.server ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4817, in volume_snapshot_create
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self.driver.volume_snapshot_create(context, instance, volume_id,
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3790, in volume_snapshot_create
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 271, in __exit__
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self.force_reraise()
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/excutils.py", line 233, in force_reraise
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server raise self.value
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3787, in volume_snapshot_create
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self._volume_snapshot_create(context, instance, guest,
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3737, in _volume_snapshot_create
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server guest.snapshot(snapshot, no_metadata=True, disk_only=True,
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 583, in snapshot
Mai
07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server self._domain.snapshotCreateXML(device_xml, flags=flags)
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/libvirt.py", line 3125, in snapshotCreateXML
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server raise libvirtError('virDomainSnapshotCreateXML() failed')
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server libvirt.libvirtError: missing existing file for disk vdb: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/new_file: No such file or directory
Mai 07 19:32:55 devstack nova-compute[86443]: ERROR oslo_messaging.rpc.server
Mai 07 19:32:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:32:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:32:56 devstack nova-compute[86443]: INFO nova.compute.manager [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Detaching volume
f85ecce2-9e09-4c47-ba66-26a684a5cb5f
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing instance network info cache due to event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] Acquiring lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:32:56 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Attempting to driver detach volume f85ecce2-9e09-4c47-ba66-26a684a5cb5f from mountpoint /dev/vdb
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Found disk vdb by
alias ua-f85ecce2-9e09-4c47-ba66-26a684a5cb5f {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Found disk vdb by alias ua-f85ecce2-9e09-4c47-ba66-26a684a5cb5f {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Attempting to detach device vdb from instance 7f0ddc97-b43e-4a5a-8690-7583f53320f0 from the persistent domain config. {{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] detach device xml:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]: f85ecce2-9e09-4c47-ba66-26a684a5cb5f
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}}
Mai 07 19:32:56 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Successfully detached device vdb from instance 7f0ddc97-b43e-4a5a-8690-7583f53320f0 from the persistent domain config.
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] (1/8): Attempting to detach device vdb with device alias ua-f85ecce2-9e09-4c47-ba66-26a684a5cb5f from instance 7f0ddc97-b43e-4a5a-8690-7583f53320f0 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] detach device xml:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]: f85ecce2-9e09-4c47-ba66-26a684a5cb5f
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]:
Mai 07 19:32:56 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}}
Mai 07 19:32:56 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false,
"delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event ua-f85ecce2-9e09-4c47-ba66-26a684a5cb5f> from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Start waiting for the detach event from libvirt for device vdb with device alias ua-f85ecce2-9e09-4c47-ba66-26a684a5cb5f for instance 7f0ddc97-b43e-4a5a-8690-7583f53320f0 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}}
Mai 07 19:32:56 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Successfully detached device vdb from instance 7f0ddc97-b43e-4a5a-8690-7583f53320f0 from the live domain config.
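The repeated "During handling of the above exception, another exception occurred" markers earlier in this trace are Python's implicit exception chaining: the SnapshotNotFound raised while reporting the failure back to Cinder still carries the original cinderclient NotFound on its `__context__`. A minimal sketch of the mechanism (generic exception types, not Nova's):

```python
# Implicit chaining: raising inside an except block records the original
# exception on the new exception's __context__ attribute, which is what
# the traceback printer renders as "During handling of the above
# exception, another exception occurred".
try:
    try:
        raise FileNotFoundError("original failure")
    except FileNotFoundError:
        raise RuntimeError("failure while handling the first error")
except RuntimeError as exc:
    chained = exc.__context__

assert isinstance(chained, FileNotFoundError)
```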
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Couldn't unmount the Quobyte Volume at /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Mai 07 19:32:56 devstack nova-compute[86443]: Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 Mai 07 19:32:56 devstack nova-compute[86443]: Exit code: 32 Mai 07 19:32:56 devstack nova-compute[86443]: Stdout: '' Mai 07 19:32:56 devstack nova-compute[86443]: Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n' Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Traceback (most recent call last): Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte File "/opt/stack/nova/nova/virt/libvirt/volume/quobyte.py", line 96, in umount_volume Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte nova.privsep.libvirt.umount(mnt_base) Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 315, in _wrap Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte return self.channel.remote_call(name, args, kwargs, r_call_timeout) Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 262, in remote_call Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte raise exc_type(*result[2]) Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command. 
Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Exit code: 32 Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stdout: '' Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n' Mai 07 19:32:56 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.024s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.00658596099992792 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=1.1865784679998796 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=1.186318098999891 future=) was cancelled while queued, skipping {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 
tempest-DeleteServersTestJSON-416667827-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:32:56 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:56 devstack nova-compute[86443]: INFO nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] VM Stopped (Lifecycle Event) Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Releasing lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'flavor' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] Acquired lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing network info cache for port 
4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: a9e2b18d-247f-4143-831c-3f98b7be0ef8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lazy-loading 'flavor' on Instance uuid 7f0ddc97-b43e-4a5a-8690-7583f53320f0 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:57 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999395174000028 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.994633671999964 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.992625594999936 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:32:58 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance destroyed successfully. 
Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'numa_topology' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:32:58 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing instance network info cache due to event network-changed-4dc7dc14-7c9d-468b-941a-a17ba2f0390b. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] Acquiring lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updated VIF entry in instance network info cache for port 4dc7dc14-7c9d-468b-941a-a17ba2f0390b. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-59b856e9-9613-4c4b-98d8-36ad19bf9da3 tempest-VolumesAssistedSnapshotsTest-802678434 tempest-VolumesAssistedSnapshotsTest-802678434-project-admin] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 2.768s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Object Instance lazy-loaded attributes: flavor,numa_topology {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-48ae9357-5451-4bee-b24d-3ffabf781ea8 req-095878e7-6f70-4cf2-81e7-59e61999446c service nova] Releasing lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:32:59 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] Acquired lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:32:59 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Refreshing network info cache for port 4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:32:59 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:32:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:32:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:00 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] The python binding code 
in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Removed pending STOPPED event for fbda5d5e-085d-499a-9b6c-5e8e388d5363 due to new event Resumed> {{(pid=86443) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:520}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:33:00 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Resumed (Lifecycle Event) Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ad363b34-0aa3-4461-9e24-b046758a67e7 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:00 devstack nova-compute[86443]: WARNING nova.compute.manager [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state rescued and task_state unrescuing. 
Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:00 devstack nova-compute[86443]: WARNING nova.compute.manager [req-20102203-acc3-4f6f-aadc-d24e1a4407ae req-66def496-e0d5-4f09-ac39-ab5b575958d5 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state rescued and task_state unrescuing. Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updated VIF entry in instance network info cache for port 4dc7dc14-7c9d-468b-941a-a17ba2f0390b. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:33:00 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [{"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] Releasing lock "refresh_cache-fbda5d5e-085d-499a-9b6c-5e8e388d5363" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] Acquiring lock 
"fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:01 devstack nova-compute[86443]: WARNING nova.compute.manager [req-d94aeac8-3bb0-401a-8ce1-aeb4d97dc65f req-eeb97818-d5dc-44a4-b9ad-d8dff53bf1a0 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state rescued and task_state unrescuing. 
Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:01 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:33:01 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Started (Lifecycle Event) Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-078862ff-87b1-41da-8997-8b43585041c2 req-bd0fd1cd-80c8-475c-a866-d5a7b91a5ff9 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils 
[req-078862ff-87b1-41da-8997-8b43585041c2 req-bd0fd1cd-80c8-475c-a866-d5a7b91a5ff9 service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-078862ff-87b1-41da-8997-8b43585041c2 req-bd0fd1cd-80c8-475c-a866-d5a7b91a5ff9 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-078862ff-87b1-41da-8997-8b43585041c2 req-bd0fd1cd-80c8-475c-a866-d5a7b91a5ff9 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-078862ff-87b1-41da-8997-8b43585041c2 req-bd0fd1cd-80c8-475c-a866-d5a7b91a5ff9 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:02 devstack nova-compute[86443]: WARNING nova.compute.manager [req-078862ff-87b1-41da-8997-8b43585041c2 req-bd0fd1cd-80c8-475c-a866-d5a7b91a5ff9 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received unexpected event network-vif-plugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with vm_state active and task_state None. 
Mai 07 19:33:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:03 devstack 
nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:03 devstack nova-compute[86443]: INFO nova.compute.manager [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Terminating instance Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Start destroying the instance on the hypervisor. 
{{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999453002999871 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:04 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Instance destroyed successfully. 
Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lazy-loading 'resources' on Instance uuid 7f0ddc97-b43e-4a5a-8690-7583f53320f0 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-efd45c3a-b0ac-4eea-9574-67da595f1521 req-80f3088c-a944-4abd-bc7a-e2da462e8484 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-vif-unplugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-efd45c3a-b0ac-4eea-9574-67da595f1521 req-80f3088c-a944-4abd-bc7a-e2da462e8484 service nova] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-efd45c3a-b0ac-4eea-9574-67da595f1521 req-80f3088c-a944-4abd-bc7a-e2da462e8484 service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-efd45c3a-b0ac-4eea-9574-67da595f1521 req-80f3088c-a944-4abd-bc7a-e2da462e8484 service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-efd45c3a-b0ac-4eea-9574-67da595f1521 req-80f3088c-a944-4abd-bc7a-e2da462e8484 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] No waiting events found dispatching network-vif-unplugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-efd45c3a-b0ac-4eea-9574-67da595f1521 req-80f3088c-a944-4abd-bc7a-e2da462e8484 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-vif-unplugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:31:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-367025004',display_name='tempest-VolumesAssistedSnapshotsTest-server-367025004',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesassistedsnapshotstest-server-367025004',id=2,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO4w5hPBxKmd3vLt1iJD44reCQ0uqFrZUztfteqXGJNl11aaESOInYHsQrEQQB/y1JeJv2kdN3MUlgyyHmeLwK31PKytWfSeRKn35rXXESwYXLSIqlni7xjR7PTsUT6ptA==',key_name='tempest-keypair-1707075065',keypairs=,launch_index=0,launched_at=2026-05-07T17:32:14Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='ff7563f2fd6d456ca5a6feffe8a32082',ramdisk_id='',reservation_id='r-c5m0tai2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAssistedSnapshotsTest-937399013',owner_user_name='tempest-VolumesAssistedSnapshotsTest-937399013-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:32:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f3230ea9529545f48fe6fafdb05a882b',uuid=7f0ddc97-b43e-4a5a-8690-7583f53320f0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": 
"tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.56", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Converting VIF {"id": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "address": "fa:16:3e:60:f7:a7", "network": {"id": "727cabe8-9053-4d06-9f7e-70dbb412fab9", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-375743393-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.56", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff7563f2fd6d456ca5a6feffe8a32082", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffa25775-a0", "ovs_interfaceid": "ffa25775-a078-439b-bc5a-4aa8ef9b3b2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:f7:a7,bridge_name='br-int',has_traffic_filtering=True,id=ffa25775-a078-439b-bc5a-4aa8ef9b3b2b,network=Network(727cabe8-9053-4d06-9f7e-70dbb412fab9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffa25775-a0') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG os_vif [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:f7:a7,bridge_name='br-int',has_traffic_filtering=True,id=ffa25775-a078-439b-bc5a-4aa8ef9b3b2b,network=Network(727cabe8-9053-4d06-9f7e-70dbb412fab9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffa25775-a0') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 
19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffa25775-a0, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=457e5a1b-5b55-494a-920c-890b8e5620c9) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:33:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:04 devstack nova-compute[86443]: INFO os_vif [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:f7:a7,bridge_name='br-int',has_traffic_filtering=True,id=ffa25775-a078-439b-bc5a-4aa8ef9b3b2b,network=Network(727cabe8-9053-4d06-9f7e-70dbb412fab9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffa25775-a0') Mai 07 19:33:04 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Deleting instance files /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0_del Mai 07 19:33:04 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Deletion of /opt/stack/data/nova/instances/7f0ddc97-b43e-4a5a-8690-7583f53320f0_del complete Mai 07 19:33:05 devstack nova-compute[86443]: INFO nova.compute.manager [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Took 1.37 seconds to destroy the instance on the hypervisor. 
Mai 07 19:33:05 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:33:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:33:05 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:33:05 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:33:05 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:33:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-989f1df9-3670-4074-b653-2f34ed4175b4 req-6f625914-dc62-4819-a359-5f82cc345896 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-vif-unplugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-989f1df9-3670-4074-b653-2f34ed4175b4 req-6f625914-dc62-4819-a359-5f82cc345896 service nova] Acquiring lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-989f1df9-3670-4074-b653-2f34ed4175b4 req-6f625914-dc62-4819-a359-5f82cc345896 service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-989f1df9-3670-4074-b653-2f34ed4175b4 req-6f625914-dc62-4819-a359-5f82cc345896 service nova] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-989f1df9-3670-4074-b653-2f34ed4175b4 
req-6f625914-dc62-4819-a359-5f82cc345896 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] No waiting events found dispatching network-vif-unplugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-989f1df9-3670-4074-b653-2f34ed4175b4 req-6f625914-dc62-4819-a359-5f82cc345896 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-vif-unplugged-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:33:07 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:33:07 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Took 2.43 seconds to deallocate network for instance. 
Mai 07 19:33:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:08 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:33:08 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:33:08 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-f10e86ae-f006-4f2a-b49a-9591d6415d51 req-50030255-1c01-4af8-a972-4e448618b794 service nova] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Received event network-vif-deleted-ffa25775-a078-439b-bc5a-4aa8ef9b3b2b 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:09 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:33:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.177s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:09 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Deleted allocations for instance 7f0ddc97-b43e-4a5a-8690-7583f53320f0 Mai 07 19:33:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8b0234e1-3145-4623-87b6-00c1da6556a8 tempest-VolumesAssistedSnapshotsTest-937399013 tempest-VolumesAssistedSnapshotsTest-937399013-project-member] Lock "7f0ddc97-b43e-4a5a-8690-7583f53320f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 7.146s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:13 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.002832668999872112 future=) was cancelled during its delay period, skipping {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:13 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:13 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=6.168174335000003 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:13 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=6.167856696999934 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:16 devstack nova-compute[86443]: DEBUG
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.004567800000131683 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:33:19 devstack nova-compute[86443]: INFO nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] VM Stopped (Lifecycle Event) Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 
tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: 7f0ddc97-b43e-4a5a-8690-7583f53320f0] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:33:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:20 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}}
Mai 07 19:33:20 devstack nova-compute[86443]: INFO nova.compute.claims [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Claim successful on node devstack
Mai 07 19:33:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:33:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:33:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}}
Mai 07 19:33:21 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Mai 07 19:33:22 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}}
Mai 07 19:33:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.237s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:33:22 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}}
Mai 07 19:33:23 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Allocating IP information in the background. {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}}
Mai 07 19:33:23 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}}
Mai 07 19:33:23 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:33:23 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:33:23 devstack nova-compute[86443]: DEBUG nova.policy [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93c67b1b99f94c92bc013de9e4ea5a50', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c334113f19ec44068e5f9d6c5b26596d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}}
Mai 07 19:33:23 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Mai 07 19:33:24 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Successfully created port: 6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}}
Mai 07 19:33:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Start building block device mappings for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}}
Mai 07 19:33:24 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Start spawning the instance on the hypervisor. {{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}}
Mai 07 19:33:25 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Creating image(s)
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.132s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.142s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk 1073741824" returned: 0 in 0.068s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.228s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.185s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Checking if we can resize image /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}}
Mai 07 19:33:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Successfully updated port: 6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Cannot resize image /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Ensure instance console log exists: /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing instance network info cache due to event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] Acquiring lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] Acquired lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing network info cache for port 6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}}
Mai 07 19:33:26 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}}
Mai 07 19:33:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}}
Mai 07 19:33:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d5aaf34a-86d3-420c-9106-4ff813d7b04e req-982369db-1514-45f8-8465-5943456b2543 service nova] Releasing lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}}
Mai 07 19:33:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquired lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}}
Mai 07 19:33:27 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}}
Mai 07 19:33:27 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}}
Mai 07 19:33:28 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}}
Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Releasing lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}}
Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance network_info: |[{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}}
Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Start _get_guest_xml network_info=[{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}}
Mai 07 19:33:28 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-AttachVolumeShelveTestJSON-server-273934295', uuid='462f0fe5-72f4-445d-8e8d-b9fd3a9c0735'), owner=OwnerMeta(userid='93c67b1b99f94c92bc013de9e4ea5a50', username='tempest-AttachVolumeShelveTestJSON-60761633-project-member', projectid='c334113f19ec44068e5f9d6c5b26596d', projectname='tempest-AttachVolumeShelveTestJSON-60761633'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175208.9724772) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Build topologies for 1 vcpu(s) 
1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:33:28 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-273934295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumeshelvetestjson-server-273934295',id=4,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN0qxKGECoNEbYjRYyg9J6eBpQKjyUPTn/Y7GjZ2OQpxNDk8Jn4i4KnODTp8uyJYXXcinrgx2Ov8HE7RRLhSdzJb3J8GVC/uK4BHogxKpbjjNPR4YlXryX9vIW+FBnT0og==',key_name='tempest-keypair-1661872501',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c334113f19ec44068e5f9d6c5b26596d',ramdisk_id='',reservation_id='r-3921qkt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-60761633',owner_user_name='
tempest-AttachVolumeShelveTestJSON-60761633-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:33:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93c67b1b99f94c92bc013de9e4ea5a50',uuid=462f0fe5-72f4-445d-8e8d-b9fd3a9c0735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converting VIF {"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": 
"fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'pci_devices' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 
{{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] End _get_guest_xml xml= Mai 07 19:33:29 devstack nova-compute[86443]: [libvirt guest XML not reproducible: the XML tag markup was stripped during log capture, leaving only element values; recoverable values include domain name instance-00000004, uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735, memory 196608 KiB, 1 vCPU, Nova metadata (server tempest-AttachVolumeShelveTestJSON-server-273934295, created 2026-05-07 17:33:28, flavor 192 MB / 1 vCPU / 0 swap / 0 ephemeral, image bare/qcow2 with min_disk 1 and min_ram 0, rng model virtio, owner tempest-AttachVolumeShelveTestJSON-60761633-project-member in project tempest-AttachVolumeShelveTestJSON-60761633), sysinfo OpenStack Foundation / OpenStack Nova / 33.1.0 / Virtual Machine, os type hvm, and an RNG device backed by /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Preparing to wait for external event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] 
Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-273934295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumeshelvetestjson-server-273934295',id=4,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN0qxKGECoNEbYjRYyg9J6eBpQKjyUPTn/Y7GjZ2OQpxNDk8Jn4i4KnODTp8uyJYXXcinrgx2Ov8HE7RRLhSdzJb3J8GVC/uK4BHogxKpbjjNPR4YlXryX9vIW+FBnT0og==',key_name='tempest-keypair-1661872501',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c334113f19ec44068e5f9d6c5b26596d',ramdisk_id='',reservation_id='r-3921qkt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-60761633',owne
r_user_name='tempest-AttachVolumeShelveTestJSON-60761633-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:33:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93c67b1b99f94c92bc013de9e4ea5a50',uuid=462f0fe5-72f4-445d-8e8d-b9fd3a9c0735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converting VIF {"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", 
"bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG os_vif [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e236f7a0-d75a-596d-a889-a7a8395353a4', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:29 
devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c46fa4b-1c, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6c46fa4b-1c, col_values=(('qos', UUID('203eec11-447b-4ce1-bee7-8ce3990e23d1')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6c46fa4b-1c, col_values=(('external_ids', {'iface-id': '6c46fa4b-1cd3-4216-b294-254a49b6191c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:e1:f1', 'vm-uuid': '462f0fe5-72f4-445d-8e8d-b9fd3a9c0735'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:33:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:29 devstack nova-compute[86443]: INFO os_vif [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] No VIF found with MAC fa:16:3e:eb:e1:f1, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-46c29da9-182a-4998-9424-71f496c80baf req-689229be-6659-4ef8-b318-1fd00a0a3846 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-46c29da9-182a-4998-9424-71f496c80baf req-689229be-6659-4ef8-b318-1fd00a0a3846 service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-46c29da9-182a-4998-9424-71f496c80baf 
req-689229be-6659-4ef8-b318-1fd00a0a3846 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-46c29da9-182a-4998-9424-71f496c80baf req-689229be-6659-4ef8-b318-1fd00a0a3846 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:31 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-46c29da9-182a-4998-9424-71f496c80baf req-689229be-6659-4ef8-b318-1fd00a0a3846 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Processing event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:33:32 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Started (Lifecycle Event) Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:33:32 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance spawned successfully. 
Mai 07 19:33:32 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Found default for hw_disk_bus of virtio 
{{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 
tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:33 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:33:33 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Paused (Lifecycle Event) Mai 07 19:33:33 devstack nova-compute[86443]: INFO nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Took 8.44 seconds to spawn the instance on the hypervisor. 
Mai 07 19:33:33 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d8b7a260-e5b7-4631-a043-d642ff153f87 req-76bbca2b-e29d-4c01-97b3-56c030e9a37f service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d8b7a260-e5b7-4631-a043-d642ff153f87 req-76bbca2b-e29d-4c01-97b3-56c030e9a37f service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d8b7a260-e5b7-4631-a043-d642ff153f87 req-76bbca2b-e29d-4c01-97b3-56c030e9a37f service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d8b7a260-e5b7-4631-a043-d642ff153f87 req-76bbca2b-e29d-4c01-97b3-56c030e9a37f service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d8b7a260-e5b7-4631-a043-d642ff153f87 req-76bbca2b-e29d-4c01-97b3-56c030e9a37f service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] No waiting events found dispatching network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:34 devstack nova-compute[86443]: WARNING nova.compute.manager [req-d8b7a260-e5b7-4631-a043-d642ff153f87 req-76bbca2b-e29d-4c01-97b3-56c030e9a37f service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received unexpected event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c for instance with vm_state active and task_state None. Mai 07 19:33:34 devstack nova-compute[86443]: INFO nova.compute.manager [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Detaching volume 605a4e7c-0cad-4174-8367-46d132450223 Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:33:34 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Resumed (Lifecycle Event) Mai 07 19:33:34 devstack nova-compute[86443]: INFO nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 
tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Took 13.82 seconds to build instance. Mai 07 19:33:34 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Attempting to driver detach volume 605a4e7c-0cad-4174-8367-46d132450223 from mountpoint /dev/vdb Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Found disk vdb by alias ua-605a4e7c-0cad-4174-8367-46d132450223 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Found disk vdb by alias ua-605a4e7c-0cad-4174-8367-46d132450223 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Attempting to detach device vdb from instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 from the persistent domain config. 
{{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] detach device xml: [multi-line device XML not captured; serial 605a4e7c-0cad-4174-8367-46d132450223] {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:33:34 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Successfully detached device vdb from instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 from the persistent domain config. Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] (1/8): Attempting to detach device vdb with device alias ua-605a4e7c-0cad-4174-8367-46d132450223 from instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] detach device xml: [multi-line device XML not captured; serial 605a4e7c-0cad-4174-8367-46d132450223] {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Start waiting for the detach event from libvirt for device vdb with device alias ua-605a4e7c-0cad-4174-8367-46d132450223 for instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:33:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.341s {{(pid=86443)
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:35 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event ua-605a4e7c-0cad-4174-8367-46d132450223> from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}} Mai 07 19:33:35 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Successfully detached device vdb from instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 from the live domain config. Mai 07 19:33:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "connect_qb_volume" "released" by 
"nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.046s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:36 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'flavor' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:33:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6a49b04-2616-40a8-991d-1da737539b89 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 3.451s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:37 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:37 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing instance network info cache due to event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:33:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] Acquiring lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:33:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] Acquired lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:33:37 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing network info cache for port 6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:33:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:37 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:38 devstack 
nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:38 devstack nova-compute[86443]: INFO nova.compute.manager [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Terminating instance Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Start destroying the instance on the hypervisor. {{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:33:38 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999471161000201 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.99248423500012 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.992161289000023 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:38 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Instance destroyed successfully.
Mai 07 19:33:38 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lazy-loading 'resources' on Instance uuid fbda5d5e-085d-499a-9b6c-5e8e388d5363 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updated VIF entry in instance network info cache for port 6c46fa4b-1cd3-4216-b294-254a49b6191c. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", 
"ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:31:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1930754072',display_name='tempest-ServerStableDeviceRescueTest-server-1930754072',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serverstabledevicerescuetest-server-1930754072',id=1,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0/3BWn/zzrBQnAkXdd1s2YFde8yarhfzAlXsa7tLa/iX4wVP25675chgq0wVKE1yt1vGqpfarn8QigLrDIoYheGujk2mCEoyKPhrMw8JZOvM12TXRkJHFDuOg8jZSc3g==',key_name='tempest-keypair-1700510776',keypairs=,launch_index=0,launched_at=2026-05-07T17:32:52Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='573feae7be914c46a57ac9575b2f8e5f',ramdisk_id='',reservation_id='r-lwlutd2e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-2036085738',owner_user_name='tempest-ServerStableDeviceRescueTest-2036085738-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:33:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='47c66181a1ab44acb74977f56e0f3ca3',uuid=fbda5d5e-085d-499a-9b6c-5e8e388d5363,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": 
"tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converting VIF {"id": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "address": "fa:16:3e:ca:f2:d2", "network": {"id": "2dbfb390-0203-4c02-95b8-b0404f525eaa", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-164541673-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.153", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "573feae7be914c46a57ac9575b2f8e5f", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dc7dc14-7c", "ovs_interfaceid": "4dc7dc14-7c9d-468b-941a-a17ba2f0390b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG os_vif [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 
07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4dc7dc14-7c, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=87cb7963-5e3f-490b-9fe3-9e46b812bc5f) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:39 devstack nova-compute[86443]: INFO os_vif [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 
tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:f2:d2,bridge_name='br-int',has_traffic_filtering=True,id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b,network=Network(2dbfb390-0203-4c02-95b8-b0404f525eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dc7dc14-7c') Mai 07 19:33:39 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Deleting instance files /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363_del Mai 07 19:33:39 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Deletion of /opt/stack/data/nova/instances/fbda5d5e-085d-499a-9b6c-5e8e388d5363_del complete Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9a3dc653-8113-4483-8f57-725e6f78d5db req-bc252e8f-0135-406b-94a5-968e19c1c81a service nova] Releasing lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-5f969ce3-1e78-4774-b825-db40e0a8c650 req-db5dae09-7724-4604-8cbb-9f513d39b348 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [req-5f969ce3-1e78-4774-b825-db40e0a8c650 req-db5dae09-7724-4604-8cbb-9f513d39b348 service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-5f969ce3-1e78-4774-b825-db40e0a8c650 req-db5dae09-7724-4604-8cbb-9f513d39b348 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-5f969ce3-1e78-4774-b825-db40e0a8c650 req-db5dae09-7724-4604-8cbb-9f513d39b348 service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.007s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-5f969ce3-1e78-4774-b825-db40e0a8c650 req-db5dae09-7724-4604-8cbb-9f513d39b348 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-5f969ce3-1e78-4774-b825-db40e0a8c650 req-db5dae09-7724-4604-8cbb-9f513d39b348 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:33:40 devstack nova-compute[86443]: INFO nova.compute.manager [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Took 1.42 seconds to destroy the instance on the hypervisor. Mai 07 19:33:40 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:33:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:33:40 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:33:40 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:33:40 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-90ca053a-afa7-4a10-b728-083a6d8a7049 req-f9a6266e-2276-4da4-a51f-f634955b1461 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-deleted-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:41 devstack nova-compute[86443]: INFO nova.compute.manager [req-90ca053a-afa7-4a10-b728-083a6d8a7049 req-f9a6266e-2276-4da4-a51f-f634955b1461 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Neutron deleted interface 4dc7dc14-7c9d-468b-941a-a17ba2f0390b; detaching it from the instance and deleting it from the info cache Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-90ca053a-afa7-4a10-b728-083a6d8a7049 req-f9a6266e-2276-4da4-a51f-f634955b1461 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1b351a97-2f0b-4c43-9a2b-407773478c34 req-ae4b762d-def2-44b1-ba65-3631e58f9a3d service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1b351a97-2f0b-4c43-9a2b-407773478c34 req-ae4b762d-def2-44b1-ba65-3631e58f9a3d service nova] Acquiring lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1b351a97-2f0b-4c43-9a2b-407773478c34 req-ae4b762d-def2-44b1-ba65-3631e58f9a3d service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1b351a97-2f0b-4c43-9a2b-407773478c34 req-ae4b762d-def2-44b1-ba65-3631e58f9a3d service nova] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1b351a97-2f0b-4c43-9a2b-407773478c34 req-ae4b762d-def2-44b1-ba65-3631e58f9a3d service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] No waiting events found dispatching network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1b351a97-2f0b-4c43-9a2b-407773478c34 req-ae4b762d-def2-44b1-ba65-3631e58f9a3d service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Received event network-vif-unplugged-4dc7dc14-7c9d-468b-941a-a17ba2f0390b for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:33:41 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:33:42 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-90ca053a-afa7-4a10-b728-083a6d8a7049 req-f9a6266e-2276-4da4-a51f-f634955b1461 service nova] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Detach interface failed, port_id=4dc7dc14-7c9d-468b-941a-a17ba2f0390b, reason: Instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 could not be found. {{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}} Mai 07 19:33:42 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Took 2.44 seconds to deallocate network for instance. Mai 07 19:33:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None 
req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:33:43 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:33:43 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:33:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.155s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:44 devstack 
nova-compute[86443]: INFO nova.scheduler.client.report [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Deleted allocations for instance fbda5d5e-085d-499a-9b6c-5e8e388d5363 Mai 07 19:33:44 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:44 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-4ee4eadf-244c-421f-8578-acc4c0f5fac2 tempest-ServerStableDeviceRescueTest-2036085738 tempest-ServerStableDeviceRescueTest-2036085738-project-member] Lock "fbda5d5e-085d-499a-9b6c-5e8e388d5363" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 7.188s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:33:47 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:33:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:48 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): 
/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json" returned: 0 in 0.222s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:33:49 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This 
host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.032s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=5234MB free_disk=14.830142974853516GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": 
"000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:33:49 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:33:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:33:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 1 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:33:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=704MB phys_disk=25GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:33:49 up 33 min, 1 user, load average: 6.11, 3.91, 2.50\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_c334113f19ec44068e5f9d6c5b26596d': '1', 'io_workload': '0'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:33:50 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:33:50 devstack 
nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:33:51 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:33:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:51 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:33:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.250s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:33:51 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 57.58 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:33:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:53 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.0037277139999787323 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:53 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:33:53 devstack nova-compute[86443]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:33:53 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] VM Stopped (Lifecycle Event) Mai 07 19:33:54 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-79bb32b8-9c1b-4ab6-8756-2941ab937166 None None] [instance: fbda5d5e-085d-499a-9b6c-5e8e388d5363] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:33:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:33:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" by "nova.compute.manager.ComputeManager.shelve_instance..do_shelve_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" acquired by "nova.compute.manager.ComputeManager.shelve_instance..do_shelve_instance" :: waited 0.002s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:02 devstack nova-compute[86443]: INFO nova.compute.manager [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Shelving Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.99934293299998 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.990936272999988 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} 
Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.990600475999827 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:34:04 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance destroyed successfully. Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'numa_topology' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-33f0bb20-face-4e7b-a115-0af644727a16 req-4ac4c0fb-1b02-4121-8931-9252c5efccf1 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-33f0bb20-face-4e7b-a115-0af644727a16 req-4ac4c0fb-1b02-4121-8931-9252c5efccf1 service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-33f0bb20-face-4e7b-a115-0af644727a16 req-4ac4c0fb-1b02-4121-8931-9252c5efccf1 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-33f0bb20-face-4e7b-a115-0af644727a16 req-4ac4c0fb-1b02-4121-8931-9252c5efccf1 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-33f0bb20-face-4e7b-a115-0af644727a16 req-4ac4c0fb-1b02-4121-8931-9252c5efccf1 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] No waiting events found dispatching network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:04 devstack nova-compute[86443]: WARNING nova.compute.manager [req-33f0bb20-face-4e7b-a115-0af644727a16 req-4ac4c0fb-1b02-4121-8931-9252c5efccf1 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received unexpected event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c for instance with vm_state active and task_state shelving. 
Mai 07 19:34:04 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Beginning cold snapshot process Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG nova.privsep.utils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=86443) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk /opt/stack/data/nova/instances/snapshots/tmp7xdbshr_/d81ceed930c44218b848b2b60cc21371 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk /opt/stack/data/nova/instances/snapshots/tmp7xdbshr_/d81ceed930c44218b848b2b60cc21371" returned: 0 in 0.353s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:06 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Snapshot extracted, beginning image upload Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6c155f18-962f-4662-ba9f-4cf91a9027c3 req-8c740958-1cc7-42b9-9625-4463fe37a7ae service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6c155f18-962f-4662-ba9f-4cf91a9027c3 req-8c740958-1cc7-42b9-9625-4463fe37a7ae service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6c155f18-962f-4662-ba9f-4cf91a9027c3 req-8c740958-1cc7-42b9-9625-4463fe37a7ae service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6c155f18-962f-4662-ba9f-4cf91a9027c3 req-8c740958-1cc7-42b9-9625-4463fe37a7ae service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6c155f18-962f-4662-ba9f-4cf91a9027c3 req-8c740958-1cc7-42b9-9625-4463fe37a7ae service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] No waiting events found dispatching network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:06 devstack nova-compute[86443]: WARNING nova.compute.manager [req-6c155f18-962f-4662-ba9f-4cf91a9027c3 req-8c740958-1cc7-42b9-9625-4463fe37a7ae service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received unexpected event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c for instance with vm_state active and task_state shelving_image_uploading. Mai 07 19:34:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock 
"2f733e5d-791b-4963-a6f4-4e64ea8da505" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Starting instance... {{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:34:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:10 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:34:10 devstack nova-compute[86443]: INFO nova.compute.claims [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Claim successful on node devstack Mai 07 19:34:11 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Snapshot image upload complete Mai 07 19:34:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:11 devstack nova-compute[86443]: INFO nova.compute.manager [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Shelve offloading Mai 07 19:34:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:34:11 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:34:12 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance destroyed successfully. 
Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquired lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:34:12 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.189s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Start building networks asynchronously for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.023s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:12 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:34:12 devstack nova-compute[86443]: INFO nova.compute.claims [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Claim successful on node devstack Mai 07 19:34:13 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:34:13 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:34:13 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:13 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:13 devstack nova-compute[86443]: DEBUG nova.policy [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6175af70f2de4ecdb264e34fbc50978b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd215b3694bec42629ea172bb6d560623', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:34:13 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:13 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Mai 07 19:34:13 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:34:13 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:34:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Releasing lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:14 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None 
req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:14 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Start building block device mappings for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:34:14 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:34:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:14 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Successfully created port: 6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.218s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:34:15 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Creating image(s) Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "/opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "/opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 
tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "/opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "ebe5afcf74d424367d81a76637d00b51ba3dae92" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "ebe5afcf74d424367d81a76637d00b51ba3dae92" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:34:15 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:15 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:15 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance destroyed successfully. 
Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'resources' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:15 devstack nova-compute[86443]: DEBUG nova.policy [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e7648a0472014bc2be2325afb69590b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f9f94b6740934688aeea63198166dda4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:34:16 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing instance network info cache due to event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] Acquiring lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] Acquired lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing network info cache for port 6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.objects.base [None 
req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Object Instance<462f0fe5-72f4-445d-8e8d-b9fd3a9c0735> lazy-loaded attributes: numa_topology,resources {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-273934295',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumeshelvetestjson-server-273934295',id=4,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN0qxKGECoNEbYjRYyg9J6eBpQKjyUPTn/Y7GjZ2OQpxNDk8Jn4i4KnODTp8uyJYXXcinrgx2Ov8HE7RRLhSdzJb3J8GVC/uK4BHogxKpbjjNPR4YlXryX9vIW+FBnT0og==',key_name='tempest-keypair-1661872501',keypairs=,launch_index=0,launched_at=2026-05-07T17:33:33Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='c334113f19ec44068e5f9d6c5b26596d',ramdisk_id='',reservation_id='r-3921qkt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-60761633',owner_user_name='tempest-AttachVolumeShelveTestJSON-60761633-project-member',shelved_at='2026-05-07T17:34:11.156251',shelved_host='devstack',shelved_image_id='131a06d3-e6a9-43eb-96ca-9d76f6c2b34b'},tags=,task_state='shelving_offloading',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:34:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93c67b1b99f94c92bc013de9e4ea5a50',uuid=462f0fe5-72f4-445d-8e8d-b9fd3a9c0735,vcpu_model=,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": 
"fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converting VIF {"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG os_vif [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) 
__log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c46fa4b-1c, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Successfully updated port: 6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=203eec11-447b-4ce1-bee7-8ce3990e23d1) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:16 devstack nova-compute[86443]: INFO os_vif [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') Mai 07 19:34:16 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Deleting instance files /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735_del Mai 07 19:34:16 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Deletion of /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735_del complete Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] 
[instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-changed-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Refreshing instance network info cache due to event network-changed-6f6f7733-b69d-4c99-94fd-93b769a37ed8. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] Acquiring lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] Acquired lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Refreshing network info cache for port 6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Start building block device mappings 
for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.part --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:16 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.part --force-share --output=json" returned: 0 in 0.134s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.virt.images [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] fe5635da-867a-46a8-bf75-e6ea9504035d was qcow2, converting to raw {{(pid=86443) fetch_to_raw /opt/stack/nova/nova/virt/images.py:278}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.privsep.utils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=86443) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): qemu-img convert 
-t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.part /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.converted {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:16 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:16 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Deleted allocations for instance 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.part /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.converted" returned: 0 in 0.252s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:16 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Instance cache missing network info. 
{{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.converted --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92.converted --force-share --output=json" returned: 0 in 0.295s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "ebe5afcf74d424367d81a76637d00b51ba3dae92" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.935s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 
tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92 --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Successfully created port: d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 
tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92 --force-share --output=json" returned: 0 in 0.157s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "ebe5afcf74d424367d81a76637d00b51ba3dae92" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 
tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "ebe5afcf74d424367d81a76637d00b51ba3dae92" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92 --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-85b92167-e863-4f5e-86d9-aa74fc991306 
req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:17 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Start spawning the instance on the hypervisor. {{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:34:17 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Creating image(s) Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "/opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk.info" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "/opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "/opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format 
inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92 --force-share --output=json" returned: 0 in 0.189s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92,backing_fmt=raw /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk 1073741824 {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92,backing_fmt=raw /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk 1073741824" returned: 0 in 0.120s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "ebe5afcf74d424367d81a76637d00b51ba3dae92" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.339s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:17 
devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92 --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.198s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updated VIF entry in 
instance network info cache for port 6c46fa4b-1cd3-4216-b294-254a49b6191c. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ebe5afcf74d424367d81a76637d00b51ba3dae92 --force-share --output=json" returned: 0 in 0.241s {{(pid=86443) 
execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-85b92167-e863-4f5e-86d9-aa74fc991306 req-037496f9-25c1-4f4e-9072-6baa0600bdce service nova] Releasing lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquired lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-bc064e58-43c4-46e0-9312-5567cd637902 
tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.259s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk 1073741824" returned: 0 in 0.113s {{(pid=86443) 
execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.410s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:34:18 devstack nova-compute[86443]: 
DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk --force-share --output=json" returned: 0 in 0.291s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Cannot resize image /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Ensure instance console log exists: /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 
tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.170s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Checking if we can resize image 
/opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk --force-share --output=json" returned: 0 in 0.169s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Cannot resize image /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk to a smaller size. 
{{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Ensure instance console log exists: /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 
tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-545bb2a5-a61e-45bb-8cc7-a8d8b5f9d50f req-ffe91853-c94b-42d0-bba9-a96f823f547c service nova] Releasing lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.393s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Successfully updated port: d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:34:18 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None 
req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-changed-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Refreshing instance network info cache due to event network-changed-d2b17082-6210-48f7-a4ec-b177a18587ac. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] Acquiring lock "refresh_cache-fce02c77-7e4b-4107-bca0-3e4453fd67f8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] Acquired lock "refresh_cache-fce02c77-7e4b-4107-bca0-3e4453fd67f8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Refreshing 
network info cache for port d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Updating instance_info_cache with network_info: [{"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.0024822790001053363 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:34:19 
devstack nova-compute[86443]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:34:19 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Stopped (Lifecycle Event) Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-c5208d0b-80d7-4ae0-ae0b-e2f10095bc72 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" "released" by "nova.compute.manager.ComputeManager.shelve_instance..do_shelve_instance" :: held 16.371s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "refresh_cache-fce02c77-7e4b-4107-bca0-3e4453fd67f8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:19 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Releasing lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Instance network_info: |[{"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 
tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Start _get_guest_xml network_info=[{"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:34:00Z,direct_url=,disk_format='qcow2',id=fe5635da-867a-46a8-bf75-e6ea9504035d,min_disk=0,min_ram=0,name='',owner='21a92d23a8dd4a75a2eda8cfe6144e86',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:34:02Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 
'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'scsi', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/sda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'fe5635da-867a-46a8-bf75-e6ea9504035d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:34:19 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='fe5635da-867a-46a8-bf75-e6ea9504035d', instance_meta=NovaInstanceMeta(name='tempest-AttachSCSIVolumeTestJSON-server-807452503', uuid='2f733e5d-791b-4963-a6f4-4e64ea8da505'), owner=OwnerMeta(userid='6175af70f2de4ecdb264e34fbc50978b', username='tempest-AttachSCSIVolumeTestJSON-1466334568-project-member', projectid='d215b3694bec42629ea172bb6d560623', projectname='tempest-AttachSCSIVolumeTestJSON-1466334568'), image=ImageMeta(id='fe5635da-867a-46a8-bf75-e6ea9504035d', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'scsi', 'hw_disk_bus': 'scsi', 'hw_scsi_model': 'virtio-scsi'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": 
"72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175259.6451485) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CPU controller missing on host. 
{{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CPU controller found on host. {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:34:00Z,direct_url=,disk_format='qcow2',id=fe5635da-867a-46a8-bf75-e6ea9504035d,min_disk=0,min_ram=0,name='',owner='21a92d23a8dd4a75a2eda8cfe6144e86',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:34:02Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 
tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Sorted desired topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:34:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-807452503',display_name='tempest-AttachSCSIVolumeTestJSON-server-807452503',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachscsivolumetestjson-server-807452503',id=5,image_ref='fe5635da-867a-46a8-bf75-e6ea9504035d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7h3ZPXkyoD78FcTp5+unliSvNdw2hIB7eeSTDaZlr2rEarYSadi5aDijrDAGDwS3VXO5icJHGrUQzkFPzKzm0qyHBYajDVGVeP5jSpeSG5zCr8qlH9UcWsWDqbDlagFQ==',key_name='tempest-keypair-313311142',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d215b3694bec42629ea172bb6d560623',ramdisk_id='',reservation_id='r-okh73x6f',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='fe5635da-867a-46a8-bf75-e6ea9504035d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1466334568',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1466334568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6175af70f2de4ecdb264e34fbc50978b',uuid=2f733e5d-791b-4963-a6f4-4e64ea8da505,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Converting VIF {"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:73:4f,bridge_name='br-int',has_traffic_filtering=True,id=6f6f7733-b69d-4c99-94fd-93b769a37ed8,network=Network(72c7f05a-3795-4a85-bb7b-f3068e30f292),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6f7733-b6') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lazy-loading 'pci_devices' on Instance uuid 2f733e5d-791b-4963-a6f4-4e64ea8da505 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-53407833-0f0d-43af-8255-0c3891633663 None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 
tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] End _get_guest_xml xml= [libvirt domain XML elided: the XML element markup was stripped during log capture; surviving values indicate uuid 2f733e5d-791b-4963-a6f4-4e64ea8da505, domain name instance-00000005, nova:name tempest-AttachSCSIVolumeTestJSON-server-807452503, creationTime 2026-05-07 17:34:19, memory 196608 KiB, 1 vCPU, flavor 192 MB / 1 vCPU / 1 GB root / 0 ephemeral / 0 swap, image format bare/qcow2 (min_disk 1, min_ram 0), buses scsi/scsi with virtio-scsi model, owner tempest-AttachSCSIVolumeTestJSON-1466334568-project-member / tempest-AttachSCSIVolumeTestJSON-1466334568, sysinfo OpenStack Foundation / OpenStack Nova / 33.1.0, os type hvm, RNG backend /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Preparing to wait for external event network-vif-plugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443)
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:34:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-807452503',display_name='tempest-AttachSCSIVolumeTestJSON-server-807452503',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachscsivolumetestjson-server-807452503',id=5,image_ref='fe5635da-867a-46a8-bf75-e6ea9504035d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7h3ZPXkyoD78FcTp5+unliSvNdw2hIB7eeSTDaZlr2rEarYSadi5aDijrDAGDwS3VXO5icJHGrUQzkFPzKzm0qyHBYajDVGVeP5jSpeSG5zCr8qlH9UcWsWDqbDlagFQ==',key_name='tempest-keypair-313311142',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d215b3694bec42629ea172bb6d560623',ramdisk_id='',reservation_id='r-okh73x6f',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='fe5635da-867a-46a8-bf75-e6ea9504035d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1466334568',owner_user_name='tempest-AttachSCSIVolumeTestJSON-14663
34568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6175af70f2de4ecdb264e34fbc50978b',uuid=2f733e5d-791b-4963-a6f4-4e64ea8da505,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Converting VIF {"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": 
"tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:73:4f,bridge_name='br-int',has_traffic_filtering=True,id=6f6f7733-b69d-4c99-94fd-93b769a37ed8,network=Network(72c7f05a-3795-4a85-bb7b-f3068e30f292),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6f7733-b6') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG os_vif [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:73:73:4f,bridge_name='br-int',has_traffic_filtering=True,id=6f6f7733-b69d-4c99-94fd-93b769a37ed8,network=Network(72c7f05a-3795-4a85-bb7b-f3068e30f292),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6f7733-b6') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'bf2957bb-bed1-5ad7-bcb0-5333284b3441', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:20 
devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f6f7733-b6, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6f6f7733-b6, col_values=(('qos', UUID('e1da426e-6119-4610-83d9-b59f3e5bb1b1')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6f6f7733-b6, col_values=(('external_ids', {'iface-id': '6f6f7733-b69d-4c99-94fd-93b769a37ed8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:73:4f', 'vm-uuid': '2f733e5d-791b-4963-a6f4-4e64ea8da505'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:20 devstack nova-compute[86443]: INFO os_vif [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:73:4f,bridge_name='br-int',has_traffic_filtering=True,id=6f6f7733-b69d-4c99-94fd-93b769a37ed8,network=Network(72c7f05a-3795-4a85-bb7b-f3068e30f292),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6f7733-b6') Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-16443665-b0cd-44d7-a3b5-6454f7a14279 req-56a28aef-625c-41be-ac08-fc918b3b704b service nova] Releasing lock "refresh_cache-fce02c77-7e4b-4107-bca0-3e4453fd67f8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquired lock "refresh_cache-fce02c77-7e4b-4107-bca0-3e4453fd67f8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:20 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:34:20 devstack nova-compute[86443]: 
DEBUG nova.network.neutron [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:34:21 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Updating instance_info_cache with network_info: [{"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": 
false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Releasing lock "refresh_cache-fce02c77-7e4b-4107-bca0-3e4453fd67f8" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Instance network_info: |[{"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": 
"d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Start _get_guest_xml network_info=[{"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:34:21 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-ServersNegativeTestJSON-server-689812090', uuid='fce02c77-7e4b-4107-bca0-3e4453fd67f8'), owner=OwnerMeta(userid='e7648a0472014bc2be2325afb69590b9', username='tempest-ServersNegativeTestJSON-1436591115-project-member', projectid='f9f94b6740934688aeea63198166dda4', projectname='tempest-ServersNegativeTestJSON-1436591115'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175261.729551) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" acquired by
"nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:21 devstack nova-compute[86443]: INFO nova.compute.manager [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Unshelving Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:34:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-689812090',display_name='tempest-ServersNegativeTestJSON-server-689812090',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serversnegativetestjson-server-689812090',id=6,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9f94b6740934688aeea63198166dda4',ramdisk_id='',reservation_id='r-ptn0rtzn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shut
down_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1436591115',owner_user_name='tempest-ServersNegativeTestJSON-1436591115-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:17Z,user_data=None,user_id='e7648a0472014bc2be2325afb69590b9',uuid=fce02c77-7e4b-4107-bca0-3e4453fd67f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converting VIF {"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:99:94,bridge_name='br-int',has_traffic_filtering=True,id=d2b17082-6210-48f7-a4ec-b177a18587ac,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2b17082-62') {{(pid=86443) 
nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lazy-loading 'pci_devices' on Instance uuid fce02c77-7e4b-4107-bca0-3e4453fd67f8 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] No BDM found with device name sda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] No BDM found with device name sdb, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:34:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] No VIF found with MAC fa:16:3e:73:73:4f, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:34:21 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Using config drive Mai 07 19:34:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] End _get_guest_xml xml= [libvirt domain XML not recoverable: the element markup was stripped during extraction, leaving only bare values interleaved with journald prefixes; the surviving values identify domain uuid fce02c77-7e4b-4107-bca0-3e4453fd67f8, name instance-00000006, 196608 KiB memory, 1 vCPU, creation time 2026-05-07 17:34:21, Nova metadata (flavor 192 MB / 1 vCPU / 1 GB root disk, image format bare/qcow2, owner tempest-ServersNegativeTestJSON-1436591115-project-member / tempest-ServersNegativeTestJSON-1436591115), sysinfo OpenStack Foundation / OpenStack Nova / 33.1.0, os type hvm, and a virtio RNG backed by /dev/urandom] {{(pid=86443) _get_guest_xml
/opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:34:22 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Preparing to wait for external event network-vif-plugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:34:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07
19:34:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:34:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-689812090',display_name='tempest-ServersNegativeTestJSON-server-689812090',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serversnegativetestjson-server-689812090',id=6,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9f94b6740934688aeea63198166dda4',ramdisk_id='',reservation_id='r-ptn0rtzn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempes
t-ServersNegativeTestJSON-1436591115',owner_user_name='tempest-ServersNegativeTestJSON-1436591115-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:17Z,user_data=None,user_id='e7648a0472014bc2be2325afb69590b9',uuid=fce02c77-7e4b-4107-bca0-3e4453fd67f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:34:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converting VIF {"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": 
[{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:99:94,bridge_name='br-int',has_traffic_filtering=True,id=d2b17082-6210-48f7-a4ec-b177a18587ac,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2b17082-62') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG os_vif [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:99:94,bridge_name='br-int',has_traffic_filtering=True,id=d2b17082-6210-48f7-a4ec-b177a18587ac,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2b17082-62') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '57746040-c4ed-59de-95d2-96893f876791', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2b17082-62, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd2b17082-62, col_values=(('qos', UUID('4ab8a09a-513d-4407-9f6d-9bf0d67b2e72')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd2b17082-62, col_values=(('external_ids', {'iface-id': 'd2b17082-6210-48f7-a4ec-b177a18587ac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:99:94', 'vm-uuid': 'fce02c77-7e4b-4107-bca0-3e4453fd67f8'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: INFO os_vif [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:99:94,bridge_name='br-int',has_traffic_filtering=True,id=d2b17082-6210-48f7-a4ec-b177a18587ac,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2b17082-62')
May 07 19:34:22 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
May 07 19:34:22 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Creating config drive at /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk.config
May 07 19:34:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 33.1.0 -quiet -J -r -V config-2 /tmp/tmpem4atzru {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 33.1.0 -quiet -J -r -V config-2 /tmp/tmpem4atzru" returned: 0 in 0.043s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
May 07 19:34:22 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'pci_requests' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-036ccd80-782b-46f3-89b8-f8844c8b9912 req-dd90fc9e-26cc-4a4f-8f86-1143cecc3bf3 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-vif-plugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-036ccd80-782b-46f3-89b8-f8844c8b9912 req-dd90fc9e-26cc-4a4f-8f86-1143cecc3bf3 service nova] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-036ccd80-782b-46f3-89b8-f8844c8b9912 req-dd90fc9e-26cc-4a4f-8f86-1143cecc3bf3 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-036ccd80-782b-46f3-89b8-f8844c8b9912 req-dd90fc9e-26cc-4a4f-8f86-1143cecc3bf3 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-036ccd80-782b-46f3-89b8-f8844c8b9912 req-dd90fc9e-26cc-4a4f-8f86-1143cecc3bf3 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Processing event network-vif-plugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'numa_topology' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}}
May 07 19:34:23 devstack nova-compute[86443]: INFO nova.compute.claims [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Claim successful on node devstack
May 07 19:34:23 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}}
May 07 19:34:23 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] No VIF found with MAC fa:16:3e:b2:99:94, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}}
May 07 19:34:24 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] VM Started (Lifecycle Event)
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}}
May 07 19:34:24 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Instance spawned successfully.
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}}
May 07 19:34:24 devstack nova-compute[86443]: INFO nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Took 9.35 seconds to spawn the instance on the hypervisor.
May 07 19:34:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-047ff4a1-e761-4965-b607-330cdadb9cbb req-ba4f46a3-8b6a-42e6-b682-8a067a202ae6 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-vif-plugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-047ff4a1-e761-4965-b607-330cdadb9cbb req-ba4f46a3-8b6a-42e6-b682-8a067a202ae6 service nova] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-047ff4a1-e761-4965-b607-330cdadb9cbb req-ba4f46a3-8b6a-42e6-b682-8a067a202ae6 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-047ff4a1-e761-4965-b607-330cdadb9cbb req-ba4f46a3-8b6a-42e6-b682-8a067a202ae6 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-047ff4a1-e761-4965-b607-330cdadb9cbb req-ba4f46a3-8b6a-42e6-b682-8a067a202ae6 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Processing event network-vif-plugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:25 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] During sync_power_state the instance has a pending task (spawning). Skip.
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}}
May 07 19:34:25 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] VM Paused (Lifecycle Event)
May 07 19:34:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:25 devstack nova-compute[86443]: INFO nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Took 14.74 seconds to build instance.
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-239ab14b-b3fa-4cf4-99ed-22ab33c0a72a req-8bcf99cb-ba77-4076-8988-a2076b449272 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-vif-plugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-239ab14b-b3fa-4cf4-99ed-22ab33c0a72a req-8bcf99cb-ba77-4076-8988-a2076b449272 service nova] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-239ab14b-b3fa-4cf4-99ed-22ab33c0a72a req-8bcf99cb-ba77-4076-8988-a2076b449272 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-239ab14b-b3fa-4cf4-99ed-22ab33c0a72a req-8bcf99cb-ba77-4076-8988-a2076b449272 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-239ab14b-b3fa-4cf4-99ed-22ab33c0a72a req-8bcf99cb-ba77-4076-8988-a2076b449272 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] No waiting events found dispatching network-vif-plugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}}
May 07 19:34:25 devstack nova-compute[86443]: WARNING nova.compute.manager [req-239ab14b-b3fa-4cf4-99ed-22ab33c0a72a req-8bcf99cb-ba77-4076-8988-a2076b449272 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received unexpected event network-vif-plugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 for instance with vm_state active and task_state None.
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}}
May 07 19:34:25 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] VM Resumed (Lifecycle Event)
May 07 19:34:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.269s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}}
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}}
May 07 19:34:25 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Instance spawned successfully.
May 07 19:34:25 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.573s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
May 07 19:34:26 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:26 devstack nova-compute[86443]: INFO nova.network.neutron [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating port 6c46fa4b-1cd3-4216-b294-254a49b6191c with attributes {'binding:host_id': 'devstack', 'device_owner': 'compute:nova'}
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}}
May 07 19:34:26 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] VM Started (Lifecycle Event)
May 07 19:34:26 devstack nova-compute[86443]: INFO nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Took 9.40 seconds to spawn the instance on the hypervisor.
May 07 19:34:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
May 07 19:34:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
May 07 19:34:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}}
May 07 19:34:27 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
May 07 19:34:27 devstack nova-compute[86443]: INFO nova.compute.manager [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Took 14.78 seconds to build instance.
May 07 19:34:27 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}}
May 07 19:34:27 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] VM Paused (Lifecycle Event)
May 07 19:34:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-57f5d83b-6a92-43ed-ab92-1a219da87af9 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.311s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
May 07 19:34:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}}
May 07 19:34:28 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}}
May 07 19:34:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] VM Resumed (Lifecycle Event)
May 07 19:34:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquired lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:28 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:34:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:34:29 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-f9edd2bd-6234-4edf-b78c-e4bf71d65080 req-a2b4a2d8-de9d-4e84-8b3d-05b947113337 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-vif-plugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f9edd2bd-6234-4edf-b78c-e4bf71d65080 req-a2b4a2d8-de9d-4e84-8b3d-05b947113337 service nova] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f9edd2bd-6234-4edf-b78c-e4bf71d65080 req-a2b4a2d8-de9d-4e84-8b3d-05b947113337 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-f9edd2bd-6234-4edf-b78c-e4bf71d65080 req-a2b4a2d8-de9d-4e84-8b3d-05b947113337 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-f9edd2bd-6234-4edf-b78c-e4bf71d65080 req-a2b4a2d8-de9d-4e84-8b3d-05b947113337 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] No waiting events found dispatching network-vif-plugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:29 devstack nova-compute[86443]: WARNING nova.compute.manager [req-f9edd2bd-6234-4edf-b78c-e4bf71d65080 req-a2b4a2d8-de9d-4e84-8b3d-05b947113337 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received unexpected event network-vif-plugged-d2b17082-6210-48f7-a4ec-b177a18587ac for instance with vm_state active and task_state None. Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing instance network info cache due to event network-changed-6c46fa4b-1cd3-4216-b294-254a49b6191c. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] Acquiring lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:29 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:29 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Releasing lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:30 devstack 
nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:34:30 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Creating image(s) Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] Acquired lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Refreshing network info cache for port 6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None 
req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.006s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'trusted_certs' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:30 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Object Instance<462f0fe5-72f4-445d-8e8d-b9fd3a9c0735> lazy-loaded attributes: pci_requests,trusted_certs {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "4273af6580afcfa9e6ec381175b32f5649ae4f7b" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:30 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "4273af6580afcfa9e6ec381175b32f5649ae4f7b" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:31 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-changed-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:31 devstack nova-compute[86443]: DEBUG nova.compute.manager 
[req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Refreshing instance network info cache due to event network-changed-6f6f7733-b69d-4c99-94fd-93b769a37ed8. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:34:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] Acquiring lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] Acquired lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:31 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Refreshing network info cache for port 6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:34:31 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:32 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:32 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:32 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updated VIF entry in instance network info cache for port 6c46fa4b-1cd3-4216-b294-254a49b6191c. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:34:32 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": 
false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:32 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-daa6dfb7-0536-4ae2-8ad9-e9fbd4b006ca req-036c7395-42cc-4978-8eaf-d01ff2d50d8e service nova] Releasing lock "refresh_cache-462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:33 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:33 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Updated VIF entry in instance network info cache for port 6f6f7733-b69d-4c99-94fd-93b769a37ed8. 
{{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:34:33 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Updating instance_info_cache with network_info: [{"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-c76ee0c6-9854-487f-9837-6e3355b06c1b req-d406f51a-8358-44ec-a850-c247a010367b service nova] Releasing lock "refresh_cache-2f733e5d-791b-4963-a6f4-4e64ea8da505" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG 
oslo_utils.imageutils.format_inspector [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.part --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.part --force-share --output=json" returned: 0 in 0.223s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG nova.virt.images [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] 131a06d3-e6a9-43eb-96ca-9d76f6c2b34b was qcow2, converting to raw {{(pid=86443) fetch_to_raw /opt/stack/nova/nova/virt/images.py:278}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG nova.privsep.utils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=86443) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Mai 07 19:34:34 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None 
req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.part /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.converted {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.part /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.converted" returned: 0 in 0.229s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.converted --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b.converted --force-share --output=json" returned: 0 in 0.238s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "4273af6580afcfa9e6ec381175b32f5649ae4f7b" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 4.374s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not 
found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b --force-share --output=json" returned: 0 in 0.244s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "4273af6580afcfa9e6ec381175b32f5649ae4f7b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 
tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "4273af6580afcfa9e6ec381175b32f5649ae4f7b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None 
req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b --force-share --output=json" returned: 0 in 0.198s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b,backing_fmt=raw /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.011s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:34:35 devstack nova-compute[86443]: INFO nova.compute.claims [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Claim successful on node devstack Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b,backing_fmt=raw /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk 1073741824" returned: 0 in 0.176s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "4273af6580afcfa9e6ec381175b32f5649ae4f7b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.397s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:35 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 
tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4273af6580afcfa9e6ec381175b32f5649ae4f7b --force-share --output=json" returned: 0 in 0.195s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:36 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'migration_context' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:36 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Object Instance<462f0fe5-72f4-445d-8e8d-b9fd3a9c0735> lazy-loaded attributes: 
pci_requests,trusted_certs,migration_context {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:34:36 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Rebasing disk image. Mai 07 19:34:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.139s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Running cmd (subprocess): qemu-img rebase -b /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a -F raw 
/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:37 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:34:37 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:34:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:37 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:34:38 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.451s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:38 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:34:38 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Allocating IP information in the background. {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:34:38 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:34:38 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:38 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.policy [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e7648a0472014bc2be2325afb69590b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f9f94b6740934688aeea63198166dda4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:34:39 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CMD "qemu-img rebase -b /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a -F raw /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk" returned: 0 in 3.087s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Ensure instance console log exists: /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Start _get_guest_xml network_info=[{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", 
"datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='76723dfdc585825bc1e9988126a1f6e6',container_format='bare',created_at=2026-05-07T17:34:02Z,direct_url=,disk_format='qcow2',id=131a06d3-e6a9-43eb-96ca-9d76f6c2b34b,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-273934295-shelved',owner='c334113f19ec44068e5f9d6c5b26596d',properties=ImageMetaProps,protected=,size=53477376,status='active',tags=,updated_at=2026-05-07T17:34:11Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:34:39 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-AttachVolumeShelveTestJSON-server-273934295', uuid='462f0fe5-72f4-445d-8e8d-b9fd3a9c0735'), owner=OwnerMeta(userid='93c67b1b99f94c92bc013de9e4ea5a50', username='tempest-AttachVolumeShelveTestJSON-60761633-project-member', projectid='c334113f19ec44068e5f9d6c5b26596d', projectname='tempest-AttachVolumeShelveTestJSON-60761633'), image=ImageMeta(id='131a06d3-e6a9-43eb-96ca-9d76f6c2b34b', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'ide', 'hw_disk_bus': 'virtio', 'hw_machine_type': 'pc', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, 
"bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175279.9200819) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='76723dfdc585825bc1e9988126a1f6e6',container_format='bare',created_at=2026-05-07T17:34:02Z,direct_url=,disk_format='qcow2',id=131a06d3-e6a9-43eb-96ca-9d76f6c2b34b,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-273934295-shelved',owner='c334113f19ec44068e5f9d6c5b26596d',properties=ImageMetaProps,protected=,size=53477376,status='active',tags=,updated_at=2026-05-07T17:34:11Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG 
nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Build 
topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:34:39 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'vcpu_model' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Object Instance<462f0fe5-72f4-445d-8e8d-b9fd3a9c0735> lazy-loaded attributes: pci_requests,trusted_certs,migration_context,vcpu_model {{(pid=86443) wrapper 
/opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='',created_at=2026-05-07T17:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-273934295',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumeshelvetestjson-server-273934295',id=4,image_ref='131a06d3-e6a9-43eb-96ca-9d76f6c2b34b',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name='tempest-keypair-1661872501',keypairs=,launch_index=0,launched_at=2026-05-07T17:33:33Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c334113f19ec44068e5f9d6c5b26596d',ramdisk_id='',reservation_id='r-3921qkt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_o
wner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-60761633',owner_user_name='tempest-AttachVolumeShelveTestJSON-60761633-project-member',shelved_at='2026-05-07T17:34:11.156251',shelved_host='devstack',shelved_image_id='131a06d3-e6a9-43eb-96ca-9d76f6c2b34b'},tags=,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93c67b1b99f94c92bc013de9e4ea5a50',uuid=462f0fe5-72f4-445d-8e8d-b9fd3a9c0735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converting VIF {"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'pci_devices' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:34:40 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Creating image(s) Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "/opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "/opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 
tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "/opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:40 devstack nova-compute[86443]: DEBUG nova.network.neutron 
[None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Successfully created port: a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}}
Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Object Instance<462f0fe5-72f4-445d-8e8d-b9fd3a9c0735> lazy-loaded attributes: pci_requests,trusted_certs,migration_context,vcpu_model,pci_devices {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}}
Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] End _get_guest_xml xml=
[guest domain XML omitted: the XML markup was stripped during log capture, leaving only text nodes. Recoverable values from the dump: uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735, name instance-00000004, memory 196608, vcpus 1, nova name tempest-AttachVolumeShelveTestJSON-server-273934295, creation time 2026-05-07 17:34:39, flavor 192 MB RAM / 1 vCPU / root 1 / ephemeral 0 / swap 0, image container format bare, disk format qcow2, min disk 1, min ram 0, cdrom bus ide, disk bus virtio, machine type pc, rng/video/vif models virtio, owner tempest-AttachVolumeShelveTestJSON-60761633-project-member / tempest-AttachVolumeShelveTestJSON-60761633, sysinfo OpenStack Foundation / OpenStack Nova / version 33.1.0, product Virtual Machine, os type hvm, rng backend /dev/urandom]
{{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}}
Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Preparing to wait for external event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}}
Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock
"462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='',created_at=2026-05-07T17:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-273934295',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumeshelvetestjson-server-273934295',id=4,image_ref='131a06d3-e6a9-43eb-96ca-9d76f6c2b34b',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name='tempest-keypair-1661872501',keypairs=,launch_index=0,launched_at=2026-05-07T17:33:33Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c334113f19ec44068e5f9d6c5b26596d',ramdisk_id='',reservation_id='r-3921qkt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide
',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-60761633',owner_user_name='tempest-AttachVolumeShelveTestJSON-60761633-project-member',shelved_at='2026-05-07T17:34:11.156251',shelved_host='devstack',shelved_image_id='131a06d3-e6a9-43eb-96ca-9d76f6c2b34b'},tags=,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93c67b1b99f94c92bc013de9e4ea5a50',uuid=462f0fe5-72f4-445d-8e8d-b9fd3a9c0735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": 
"6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converting VIF {"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG os_vif [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:41 devstack 
nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e236f7a0-d75a-596d-a889-a7a8395353a4', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c46fa4b-1c, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6c46fa4b-1c, col_values=(('qos', UUID('97ee5f69-7284-4af2-984c-9aec171e4c3e')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6c46fa4b-1c, col_values=(('external_ids', {'iface-id': '6c46fa4b-1cd3-4216-b294-254a49b6191c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:e1:f1', 'vm-uuid': '462f0fe5-72f4-445d-8e8d-b9fd3a9c0735'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.197s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:41 devstack nova-compute[86443]: INFO os_vif [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 
tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.146s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk 1073741824" returned: 0 in 0.059s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.244s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.135s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Checking if we can resize image /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None 
req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Cannot resize image /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Ensure instance console log exists: /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Successfully updated port: a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-changed-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Refreshing instance network info cache due to event network-changed-a08e4518-622a-4bbf-a827-09c8e40411fd. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] Acquiring lock "refresh_cache-afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] Acquired lock "refresh_cache-afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Refreshing network info cache for port a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "refresh_cache-afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:34:42 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] No VIF found with MAC fa:16:3e:eb:e1:f1, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:42 devstack nova-compute[86443]: DEBUG nova.network.neutron 
[req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-28fa4529-1642-4673-a1ef-d04e1cb389a9 req-d7628c2d-6e0e-4d93-87d2-f95a8c90b8cd service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-28fa4529-1642-4673-a1ef-d04e1cb389a9 req-d7628c2d-6e0e-4d93-87d2-f95a8c90b8cd service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-28fa4529-1642-4673-a1ef-d04e1cb389a9 req-d7628c2d-6e0e-4d93-87d2-f95a8c90b8cd service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-28fa4529-1642-4673-a1ef-d04e1cb389a9 req-d7628c2d-6e0e-4d93-87d2-f95a8c90b8cd service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG 
nova.compute.manager [req-28fa4529-1642-4673-a1ef-d04e1cb389a9 req-d7628c2d-6e0e-4d93-87d2-f95a8c90b8cd service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Processing event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-79921f60-1841-4f7e-a55f-9579006ae82d req-1072b94b-7c4e-4a05-a1e0-447b5369c8f3 service nova] Releasing lock "refresh_cache-afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquired lock "refresh_cache-afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) 
__log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:34:44 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Started (Lifecycle Event) Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:34:44 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 
462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance spawned successfully. Mai 07 19:34:44 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Updating instance_info_cache with network_info: [{"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.compute.manager 
[None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Releasing lock "refresh_cache-afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Instance network_info: |[{"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": 
"10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Start _get_guest_xml network_info=[{"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, 
"devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:34:44 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-ServersNegativeTestJSON-server-1571814512', uuid='afbd8303-0cb9-4d5c-a79c-5912ce5d55f1'), owner=OwnerMeta(userid='e7648a0472014bc2be2325afb69590b9', username='tempest-ServersNegativeTestJSON-1436591115-project-member', projectid='f9f94b6740934688aeea63198166dda4', projectname='tempest-ServersNegativeTestJSON-1436591115'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175284.8886914) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:34:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1571814512',display_name='tempest-ServersNegativeTestJSON-server-1571814512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serversnegativetestjson-server-1571814512',id=7,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9f94b6740934688aeea63198166dda4',ramdisk_id='',reservation_id='r-8vge1k3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1436591115',owner_user_name='tempest-ServersNegativeTestJSON-1436591115-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:40Z,user_data=None,user
_id='e7648a0472014bc2be2325afb69590b9',uuid=afbd8303-0cb9-4d5c-a79c-5912ce5d55f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converting VIF {"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:d9:67,bridge_name='br-int',has_traffic_filtering=True,id=a08e4518-622a-4bbf-a827-09c8e40411fd,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa08e4518-62') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:44 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lazy-loading 'pci_devices' on Instance uuid afbd8303-0cb9-4d5c-a79c-5912ce5d55f1 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-8be898c7-c262-4642-aff2-5a81c36ccab5 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" "released" by 
"nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" :: held 23.239s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:34:45 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Paused (Lifecycle Event) Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-4fcb3375-be03-49f3-afae-1b255e1b61aa req-f5c1a4ec-269a-43b9-8956-c9148a6d5fe5 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-4fcb3375-be03-49f3-afae-1b255e1b61aa req-f5c1a4ec-269a-43b9-8956-c9148a6d5fe5 service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-4fcb3375-be03-49f3-afae-1b255e1b61aa req-f5c1a4ec-269a-43b9-8956-c9148a6d5fe5 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-4fcb3375-be03-49f3-afae-1b255e1b61aa 
req-f5c1a4ec-269a-43b9-8956-c9148a6d5fe5 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-4fcb3375-be03-49f3-afae-1b255e1b61aa req-f5c1a4ec-269a-43b9-8956-c9148a6d5fe5 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] No waiting events found dispatching network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:45 devstack nova-compute[86443]: WARNING nova.compute.manager [req-4fcb3375-be03-49f3-afae-1b255e1b61aa req-f5c1a4ec-269a-43b9-8956-c9148a6d5fe5 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received unexpected event network-vif-plugged-6c46fa4b-1cd3-4216-b294-254a49b6191c for instance with vm_state active and task_state None.
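The entries above show Nova's external-event plumbing: a worker registers interest in an event such as `network-vif-plugged-<port-id>` before triggering the action, and when the event later arrives from the network service it either completes a registered latch or, as in the warning above, is logged as unexpected because nothing was waiting. A minimal sketch of that registration-before-dispatch pattern, using only stdlib threading (class and method names here are illustrative, not Nova's actual API):

```python
import threading


class InstanceEvents:
    """Latch registry: a waiter must register interest before the event arrives."""

    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare(self, instance_uuid, event_name):
        # Called by the build path *before* the action that triggers the event
        # (e.g. before plugging the VIF), so a fast event cannot be missed.
        with self._lock:
            return self._events.setdefault((instance_uuid, event_name),
                                           threading.Event())

    def pop(self, instance_uuid, event_name):
        # Called when the external event arrives. Returns None if nobody
        # registered -- the "received unexpected event" case in the log.
        with self._lock:
            return self._events.pop((instance_uuid, event_name), None)


registry = InstanceEvents()
latch = registry.prepare('afbd8303', 'network-vif-plugged-a08e4518')


def on_external_event():
    waiter = registry.pop('afbd8303', 'network-vif-plugged-a08e4518')
    if waiter is not None:
        waiter.set()  # wakes the waiting build thread
    # else: log a warning instead, as nova-compute does above


threading.Thread(target=on_external_event).start()
assert latch.wait(timeout=5)  # build continues once the VIF is plugged
```

The per-instance `"-events"` locks seen in the log serve the same purpose as `self._lock` here: they serialize registration and dispatch so an event cannot race past its waiter.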
Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] End _get_guest_xml xml=[guest domain XML elided: the markup was stripped in this capture; surviving fragments include uuid afbd8303-0cb9-4d5c-a79c-5912ce5d55f1, name instance-00000007, display name tempest-ServersNegativeTestJSON-server-1571814512, memory 196608, vcpus 1, creation time 2026-05-07 17:34:44, sysinfo "OpenStack Foundation" / "OpenStack Nova" 33.1.0, os type hvm, image format bare/qcow2, virtio rng backed by /dev/urandom, owner tempest-ServersNegativeTestJSON-1436591115 / tempest-ServersNegativeTestJSON-1436591115-project-member] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Preparing to wait for external event network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" by
"nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:34:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1571814512',display_name='tempest-ServersNegativeTestJSON-server-1571814512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serversnegativetestjson-server-1571814512',id=7,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9f94b6740934688aeea63198166dda4',ramdisk_id='',reservation_id='r-8vge1k3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1436591115',owner_user_name='tempest-ServersNegativeTestJSON-1436591115-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:34:40Z,user_d
ata=None,user_id='e7648a0472014bc2be2325afb69590b9',uuid=afbd8303-0cb9-4d5c-a79c-5912ce5d55f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converting VIF {"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:d9:67,bridge_name='br-int',has_traffic_filtering=True,id=a08e4518-622a-4bbf-a827-09c8e40411fd,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa08e4518-62') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG os_vif [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:d9:67,bridge_name='br-int',has_traffic_filtering=True,id=a08e4518-622a-4bbf-a827-09c8e40411fd,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa08e4518-62') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'db091538-8f20-5183-b544-e2e4d8a7dc64', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa08e4518-62, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa08e4518-62, col_values=(('qos', UUID('7f6940f4-9295-4f2c-8fbf-8fb92bec69ae')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa08e4518-62, col_values=(('external_ids', {'iface-id': 'a08e4518-622a-4bbf-a827-09c8e40411fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:d9:67', 'vm-uuid': 'afbd8303-0cb9-4d5c-a79c-5912ce5d55f1'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:45 devstack nova-compute[86443]: INFO os_vif [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:d9:67,bridge_name='br-int',has_traffic_filtering=True,id=a08e4518-622a-4bbf-a827-09c8e40411fd,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa08e4518-62') Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:45 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:34:45 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Resumed (Lifecycle Event) Mai 07 19:34:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:34:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] No VIF found with MAC fa:16:3e:94:d9:67, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-dfe13eff-883b-4514-a5cb-b1c0a12afed9 req-e63b7ef7-602a-4bbb-9e52-546414269557 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:47 
devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-dfe13eff-883b-4514-a5cb-b1c0a12afed9 req-e63b7ef7-602a-4bbb-9e52-546414269557 service nova] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-dfe13eff-883b-4514-a5cb-b1c0a12afed9 req-e63b7ef7-602a-4bbb-9e52-546414269557 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-dfe13eff-883b-4514-a5cb-b1c0a12afed9 req-e63b7ef7-602a-4bbb-9e52-546414269557 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.019s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:47 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-dfe13eff-883b-4514-a5cb-b1c0a12afed9 req-e63b7ef7-602a-4bbb-9e52-546414269557 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Processing event network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:34:49 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 
afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] VM Started (Lifecycle Event) Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:34:49 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Instance spawned successfully. 
Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG 
nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:49 
devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.016s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.012s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-3f0270d9-45c1-4462-affc-b92c46dd9244 req-a9270ffb-38e7-42ca-96c6-d8b6e620acd5 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3f0270d9-45c1-4462-affc-b92c46dd9244 req-a9270ffb-38e7-42ca-96c6-d8b6e620acd5 service nova] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3f0270d9-45c1-4462-affc-b92c46dd9244 req-a9270ffb-38e7-42ca-96c6-d8b6e620acd5 service 
nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3f0270d9-45c1-4462-affc-b92c46dd9244 req-a9270ffb-38e7-42ca-96c6-d8b6e620acd5 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-3f0270d9-45c1-4462-affc-b92c46dd9244 req-a9270ffb-38e7-42ca-96c6-d8b6e620acd5 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] No waiting events found dispatching network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:49 devstack nova-compute[86443]: WARNING nova.compute.manager [req-3f0270d9-45c1-4462-affc-b92c46dd9244 req-a9270ffb-38e7-42ca-96c6-d8b6e620acd5 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received unexpected event network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd for instance with vm_state building and task_state spawning. Mai 07 19:34:50 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] During sync_power_state the instance has a pending task (spawning). Skip. 
Mai 07 19:34:50 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:34:50 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] VM Paused (Lifecycle Event) Mai 07 19:34:50 devstack nova-compute[86443]: INFO nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Took 9.47 seconds to spawn the instance on the hypervisor. Mai 07 19:34:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:50 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:34:50 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] VM 
Resumed (Lifecycle Event) Mai 07 19:34:50 devstack nova-compute[86443]: INFO nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Took 15.22 seconds to build instance. Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json" returned: 0 in 0.328s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.788s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] 
Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735/disk --force-share --output=json" returned: 0 in 0.252s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:51 devstack nova-compute[86443]: INFO nova.compute.manager [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Terminating instance Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:51 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] During sync_power_state the instance has a pending task (deleting). Skip. 
Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk --force-share --output=json" returned: 0 in 0.173s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk --force-share --output=json {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Start destroying the instance on the hypervisor. 
{{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8/disk --force-share --output=json" returned: 0 in 0.205s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.99924487699991 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.983208765999734 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the 
deadline of Task(fn=>, remaining_delay=14.982844193999881 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:34:52 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Instance destroyed successfully. Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lazy-loading 'resources' on Instance uuid afbd8303-0cb9-4d5c-a79c-5912ce5d55f1 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk --force-share --output=json" returned: 0 in 0.457s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d2667303-d9ff-47a1-b2dd-30f9b6f5dcd9 
req-56d2a36f-388c-4de8-a507-30ca14ca7ab9 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-unplugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d2667303-d9ff-47a1-b2dd-30f9b6f5dcd9 req-56d2a36f-388c-4de8-a507-30ca14ca7ab9 service nova] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d2667303-d9ff-47a1-b2dd-30f9b6f5dcd9 req-56d2a36f-388c-4de8-a507-30ca14ca7ab9 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-d2667303-d9ff-47a1-b2dd-30f9b6f5dcd9 req-56d2a36f-388c-4de8-a507-30ca14ca7ab9 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.024s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d2667303-d9ff-47a1-b2dd-30f9b6f5dcd9 req-56d2a36f-388c-4de8-a507-30ca14ca7ab9 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] No waiting events found dispatching network-vif-unplugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG 
nova.compute.manager [req-d2667303-d9ff-47a1-b2dd-30f9b6f5dcd9 req-56d2a36f-388c-4de8-a507-30ca14ca7ab9 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-unplugged-a08e4518-622a-4bbf-a827-09c8e40411fd for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:34:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1571814512',display_name='tempest-ServersNegativeTestJSON-server-1571814512',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serversnegativetestjson-server-1571814512',id=7,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-05-07T17:34:50Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f9f94b6740934688aeea63198166dda4',ramdisk_id='',reservation_id='r-8vge1k3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_fo
rmat='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1436591115',owner_user_name='tempest-ServersNegativeTestJSON-1436591115-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:34:51Z,user_data=None,user_id='e7648a0472014bc2be2325afb69590b9',uuid=afbd8303-0cb9-4d5c-a79c-5912ce5d55f1,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:34:53 devstack 
nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converting VIF {"id": "a08e4518-622a-4bbf-a827-09c8e40411fd", "address": "fa:16:3e:94:d9:67", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08e4518-62", "ovs_interfaceid": "a08e4518-622a-4bbf-a827-09c8e40411fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:d9:67,bridge_name='br-int',has_traffic_filtering=True,id=a08e4518-622a-4bbf-a827-09c8e40411fd,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa08e4518-62') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 
19:34:53 devstack nova-compute[86443]: DEBUG os_vif [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:d9:67,bridge_name='br-int',has_traffic_filtering=True,id=a08e4518-622a-4bbf-a827-09c8e40411fd,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa08e4518-62') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa08e4518-62, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=7f6940f4-9295-4f2c-8fbf-8fb92bec69ae) {{(pid=86443) do_commit 
/opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:34:53 devstack nova-compute[86443]: INFO os_vif [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:d9:67,bridge_name='br-int',has_traffic_filtering=True,id=a08e4518-622a-4bbf-a827-09c8e40411fd,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa08e4518-62') Mai 07 19:34:53 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Deleting instance files /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1_del Mai 07 19:34:53 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Deletion of /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1_del complete Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- 
env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk --force-share --output=json" returned: 1 in 0.369s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] '/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk --force-share --output=json' failed. Not Retrying. {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:34:53 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Periodic task is updating the host stats, it is trying to get disk info for instance-00000007, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk: nova.exception.DiskNotFound: No disk at /opt/stack/data/nova/instances/afbd8303-0cb9-4d5c-a79c-5912ce5d55f1/disk Mai 07 19:34:53 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.092s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=4842MB free_disk=14.626869201660156GB free_vcpus=0 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:53 devstack nova-compute[86443]: INFO nova.compute.manager [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Took 1.58 seconds to destroy the instance on the hypervisor. Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:34:53 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:34:53 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:34:53 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance fce02c77-7e4b-4107-bca0-3e4453fd67f8 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. 
{{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance afbd8303-0cb9-4d5c-a79c-5912ce5d55f1 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 4 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=1280MB phys_disk=25GB used_disk=4GB total_vcpus=4 used_vcpus=4 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:34:53 up 34 min, 1 user, load average: 8.96, 5.12, 3.01\n', 'num_instances': '4', 'num_vm_active': '4', 'num_task_None': '3', 'num_os_type_None': '4', 'num_proj_c334113f19ec44068e5f9d6c5b26596d': '1', 'io_workload': '0', 'num_proj_d215b3694bec42629ea172bb6d560623': '1', 'num_proj_f9f94b6740934688aeea63198166dda4': '2', 'num_task_deleting': '1'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:34:54 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.49 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-unplugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.010s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] Lock 
"afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] No waiting events found dispatching network-vif-unplugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-unplugged-a08e4518-622a-4bbf-a827-09c8e40411fd for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] Acquiring lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fa23262e-bbdd-4857-a373-4230801ce3f7 
req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] No waiting events found dispatching network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:34:55 devstack nova-compute[86443]: WARNING nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received unexpected event network-vif-plugged-a08e4518-622a-4bbf-a827-09c8e40411fd for instance with vm_state active and task_state deleting. 
Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Received event network-vif-deleted-a08e4518-622a-4bbf-a827-09c8e40411fd {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:34:55 devstack nova-compute[86443]: INFO nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Neutron deleted interface a08e4518-622a-4bbf-a827-09c8e40411fd; detaching it from the instance and deleting it from the info cache Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 
tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fa23262e-bbdd-4857-a373-4230801ce3f7 req-27da826f-f928-43e9-93e0-124fc3bdbec0 service nova] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Detach interface failed, port_id=a08e4518-622a-4bbf-a827-09c8e40411fd, reason: Instance afbd8303-0cb9-4d5c-a79c-5912ce5d55f1 could not be found. 
{{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.490s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 44.22 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:34:55 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lazy-loading 'flavor' on Instance uuid 2f733e5d-791b-4963-a6f4-4e64ea8da505 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:34:56 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Took 2.37 seconds to deallocate network for instance. 
Mai 07 19:34:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:34:56 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:34:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 
tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 1.521s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:57 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.312s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:58 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Deleted allocations for instance afbd8303-0cb9-4d5c-a79c-5912ce5d55f1 Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None 
req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:58 devstack nova-compute[86443]: INFO nova.compute.manager [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Attaching volume 77061e0c-3c7b-4eb7-8b80-0029c6bae21f to /dev/sdc Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG 
os_brick.initiator.connectors.scaleio [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:34:58 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:fe73:734f', 'fe80::9cb3:49ff:fec0:4cab', 'fe80::fc16:3eff:feb2:9994', 'fe80::5ca9:ecff:fe75:d6a2', 'fe80::fc16:3eff:feeb:e1f1', 'fe80::c060:9bff:fec1:af03'] Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] <== get_connector_properties: return (105ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:34:58 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Updating existing volume attachment record: cfd47248-c738-4cf9-94bf-2a9167f69286 {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:34:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2261b335-d7ee-4759-989c-fc8f2a083a96 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "afbd8303-0cb9-4d5c-a79c-5912ce5d55f1" "released" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 7.423s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:34:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:34:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] systemd detected. 
{{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:34:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Mounting volume osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa at mount point /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 via systemd-run {{(pid=86443) mount_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:79}} Mai 07 19:34:59 devstack nova-compute[86443]: INFO nova.virt.libvirt.volume.quobyte [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Mounted volume: osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa Mai 07 19:34:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.232s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:34:59 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lazy-loading 'flavor' on Instance uuid 2f733e5d-791b-4963-a6f4-4e64ea8da505 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:35:00 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] attach device xml: 
Mai 07 19:35:00 devstack nova-compute[86443]: [attach device XML body lost in extraction; only the volume serial 77061e0c-3c7b-4eb7-8b80-0029c6bae21f survives]
{{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:35:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] No BDM found with device name sda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:35:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] No BDM found with device name sdb, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:35:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] No BDM found with device name sdc, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:35:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] No VIF found with MAC fa:16:3e:73:73:4f, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:35:03 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5228fc87-8900-4df7-95c5-c1a586c6ecc2 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 5.544s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=86443) 
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:04 devstack nova-compute[86443]: INFO nova.compute.manager [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Detaching volume 77061e0c-3c7b-4eb7-8b80-0029c6bae21f Mai 07 19:35:04 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Attempting to driver detach volume 77061e0c-3c7b-4eb7-8b80-0029c6bae21f from mountpoint /dev/sdc Mai 07 19:35:04 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Found disk sdc by alias ua-77061e0c-3c7b-4eb7-8b80-0029c6bae21f {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:35:04 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Found disk sdc by alias ua-77061e0c-3c7b-4eb7-8b80-0029c6bae21f {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:35:04 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Attempting to detach device sdc from instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 from the persistent domain config. 
{{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}} Mai 07 19:35:04 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] detach device xml: [detach device XML body lost in extraction; only the volume serial 77061e0c-3c7b-4eb7-8b80-0029c6bae21f survives]
{{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:35:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Successfully detached device sdc from instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 from the persistent domain config. Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] (1/8): Attempting to detach device sdc with device alias ua-77061e0c-3c7b-4eb7-8b80-0029c6bae21f from instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}} Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] detach device xml: [detach device XML body lost in extraction; only the volume serial 77061e0c-3c7b-4eb7-8b80-0029c6bae21f survives]
{{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Start waiting for the detach event from libvirt for device sdc with device alias ua-77061e0c-3c7b-4eb7-8b80-0029c6bae21f for instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}} Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event ua-77061e0c-3c7b-4eb7-8b80-0029c6bae21f> from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}} Mai 07 19:35:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Successfully detached device sdc from instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 from the live domain config. 
Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.020s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:05 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lazy-loading 'flavor' on Instance uuid 2f733e5d-791b-4963-a6f4-4e64ea8da505 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:35:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:06 devstack 
nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-de74ac76-744e-4193-bab6-1dcad993796a tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 2.510s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:07 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.00700910700015811 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:07 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:07 devstack nova-compute[86443]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:35:07 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] VM Stopped (Lifecycle Event) Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5f07cc7e-07d3-48b7-aa99-232648f7a7fd None None] [instance: afbd8303-0cb9-4d5c-a79c-5912ce5d55f1] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed 
tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:08 devstack nova-compute[86443]: INFO nova.compute.manager [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Terminating instance
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:35:08 devstack nova-compute[86443]: INFO nova.compute.manager [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Terminating instance
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Start destroying the instance on the hypervisor. {{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Start destroying the instance on the hypervisor. {{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}}
Mai 07 19:35:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999534857000071 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.99733709200018 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.9969258670003 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Instance destroyed successfully.
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lazy-loading 'resources' on Instance uuid 2f733e5d-791b-4963-a6f4-4e64ea8da505 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999481499000012 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}}
Mai 07 19:35:09 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Instance destroyed successfully.
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lazy-loading 'resources' on Instance uuid 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:34:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-807452503',display_name='tempest-AttachSCSIVolumeTestJSON-server-807452503',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachscsivolumetestjson-server-807452503',id=5,image_ref='fe5635da-867a-46a8-bf75-e6ea9504035d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7h3ZPXkyoD78FcTp5+unliSvNdw2hIB7eeSTDaZlr2rEarYSadi5aDijrDAGDwS3VXO5icJHGrUQzkFPzKzm0qyHBYajDVGVeP5jSpeSG5zCr8qlH9UcWsWDqbDlagFQ==',key_name='tempest-keypair-313311142',keypairs=,launch_index=0,launched_at=2026-05-07T17:34:24Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d215b3694bec42629ea172bb6d560623',ramdisk_id='',reservation_id='r-okh73x6f',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='fe5635da-867a-46a8-bf75-e6ea9504035d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1466334568',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1466334568-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:34:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6175af70f2de4ecdb264e34fbc50978b',uuid=2f733e5d-791b-4963-a6f4-4e64ea8da505,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Converting VIF {"id": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "address": "fa:16:3e:73:73:4f", "network": {"id": "72c7f05a-3795-4a85-bb7b-f3068e30f292", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1642257360-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d215b3694bec42629ea172bb6d560623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f6f7733-b6", "ovs_interfaceid": "6f6f7733-b69d-4c99-94fd-93b769a37ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:73:4f,bridge_name='br-int',has_traffic_filtering=True,id=6f6f7733-b69d-4c99-94fd-93b769a37ed8,network=Network(72c7f05a-3795-4a85-bb7b-f3068e30f292),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6f7733-b6') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG os_vif [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:73:4f,bridge_name='br-int',has_traffic_filtering=True,id=6f6f7733-b69d-4c99-94fd-93b769a37ed8,network=Network(72c7f05a-3795-4a85-bb7b-f3068e30f292),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6f7733-b6') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f6f7733-b6, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e1da426e-6119-4610-83d9-b59f3e5bb1b1) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: INFO os_vif [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:73:4f,bridge_name='br-int',has_traffic_filtering=True,id=6f6f7733-b69d-4c99-94fd-93b769a37ed8,network=Network(72c7f05a-3795-4a85-bb7b-f3068e30f292),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f6f7733-b6')
Mai 07 19:35:09 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Deleting instance files /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505_del
Mai 07 19:35:09 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Deletion of /opt/stack/data/nova/instances/2f733e5d-791b-4963-a6f4-4e64ea8da505_del complete
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-eaebd9f4-77d6-45b1-a95a-db6b5c34369d req-3e10b699-c5ad-45d1-a7c3-81f65ee13267 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-vif-unplugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-eaebd9f4-77d6-45b1-a95a-db6b5c34369d req-3e10b699-c5ad-45d1-a7c3-81f65ee13267 service nova] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-eaebd9f4-77d6-45b1-a95a-db6b5c34369d req-3e10b699-c5ad-45d1-a7c3-81f65ee13267 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-eaebd9f4-77d6-45b1-a95a-db6b5c34369d req-3e10b699-c5ad-45d1-a7c3-81f65ee13267 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-eaebd9f4-77d6-45b1-a95a-db6b5c34369d req-3e10b699-c5ad-45d1-a7c3-81f65ee13267 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] No waiting events found dispatching network-vif-unplugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-eaebd9f4-77d6-45b1-a95a-db6b5c34369d req-3e10b699-c5ad-45d1-a7c3-81f65ee13267 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-vif-unplugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='',created_at=2026-05-07T17:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-273934295',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumeshelvetestjson-server-273934295',id=4,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN0qxKGECoNEbYjRYyg9J6eBpQKjyUPTn/Y7GjZ2OQpxNDk8Jn4i4KnODTp8uyJYXXcinrgx2Ov8HE7RRLhSdzJb3J8GVC/uK4BHogxKpbjjNPR4YlXryX9vIW+FBnT0og==',key_name='tempest-keypair-1661872501',keypairs=,launch_index=0,launched_at=2026-05-07T17:34:44Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c334113f19ec44068e5f9d6c5b26596d',ramdisk_id='',reservation_id='r-3921qkt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-60761633',owner_user_name='tempest-AttachVolumeShelveTestJSON-60761633-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:34:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93c67b1b99f94c92bc013de9e4ea5a50',uuid=462f0fe5-72f4-445d-8e8d-b9fd3a9c0735,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converting VIF {"id": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "address": "fa:16:3e:eb:e1:f1", "network": {"id": "fd6aab17-0e41-43d3-8659-62f185eed00e", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-782775284-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.68", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c334113f19ec44068e5f9d6c5b26596d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46fa4b-1c", "ovs_interfaceid": "6c46fa4b-1cd3-4216-b294-254a49b6191c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG os_vif [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c46fa4b-1c, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=97ee5f69-7284-4af2-984c-9aec171e4c3e) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}}
Mai 07 19:35:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:09 devstack nova-compute[86443]: INFO os_vif [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:e1:f1,bridge_name='br-int',has_traffic_filtering=True,id=6c46fa4b-1cd3-4216-b294-254a49b6191c,network=Network(fd6aab17-0e41-43d3-8659-62f185eed00e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46fa4b-1c')
Mai 07 19:35:09 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Deleting instance files /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735_del
Mai 07 19:35:09 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Deletion of /opt/stack/data/nova/instances/462f0fe5-72f4-445d-8e8d-b9fd3a9c0735_del complete
Mai 07 19:35:10 devstack nova-compute[86443]: INFO nova.compute.manager [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Took 1.47 seconds to destroy the instance on the hypervisor.
Mai 07 19:35:10 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}}
Mai 07 19:35:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}}
Mai 07 19:35:10 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}}
Mai 07 19:35:10 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:35:10 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:35:10 devstack nova-compute[86443]: INFO nova.compute.manager [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Took 1.43 seconds to destroy the instance on the hypervisor.
Mai 07 19:35:10 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}}
Mai 07 19:35:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}}
Mai 07 19:35:10 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}}
Mai 07 19:35:10 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:35:10 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-vif-unplugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Acquiring lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] No waiting events found dispatching network-vif-unplugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-vif-unplugged-6f6f7733-b69d-4c99-94fd-93b769a37ed8 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils
[req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] No waiting events found dispatching network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Acquiring lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] No waiting events found dispatching 
network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:35:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-34e996da-a558-4c6e-9d34-5d1c68726e19 req-35d7ae12-539d-48df-8173-d5192d09d814 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-unplugged-6c46fa4b-1cd3-4216-b294-254a49b6191c for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a13b9009-5e77-4132-8e62-538a81101f45 req-e9f87141-4fdd-48fe-8dd0-b3960b4861c9 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Received event network-vif-deleted-6f6f7733-b69d-4c99-94fd-93b769a37ed8 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:35:12 devstack nova-compute[86443]: INFO nova.compute.manager [req-a13b9009-5e77-4132-8e62-538a81101f45 req-e9f87141-4fdd-48fe-8dd0-b3960b4861c9 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Neutron deleted interface 6f6f7733-b69d-4c99-94fd-93b769a37ed8; detaching it from the instance and deleting it from the info cache Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-a13b9009-5e77-4132-8e62-538a81101f45 req-e9f87141-4fdd-48fe-8dd0-b3960b4861c9 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] 
[instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:35:12 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Took 2.52 seconds to deallocate network for instance. Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a13b9009-5e77-4132-8e62-538a81101f45 req-e9f87141-4fdd-48fe-8dd0-b3960b4861c9 service nova] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Detach interface failed, port_id=6f6f7733-b69d-4c99-94fd-93b769a37ed8, reason: Instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 could not be found. {{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 
tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:12 devstack nova-compute[86443]: INFO nova.compute.manager [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Terminating instance Mai 07 19:35:13 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Took 2.89 seconds to deallocate network for instance. 
Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Start destroying the instance on the hypervisor. 
{{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] 
Queued Task(fn=>, remaining_delay=14.999376737000148 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:13 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Instance destroyed successfully. Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lazy-loading 'resources' on Instance uuid fce02c77-7e4b-4107-bca0-3e4453fd67f8 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:13 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-5a9b9613-8c11-4019-bd07-072fa5307354 
req-905e117a-30be-4cfe-9f79-4e5f937f2881 service nova] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Received event network-vif-deleted-6c46fa4b-1cd3-4216-b294-254a49b6191c {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-3b5696a1-61c2-4bad-a8ef-6dd08f21f1ff req-17d263a4-f9bc-4c39-93a9-98da4d760a98 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-vif-unplugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3b5696a1-61c2-4bad-a8ef-6dd08f21f1ff req-17d263a4-f9bc-4c39-93a9-98da4d760a98 service nova] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3b5696a1-61c2-4bad-a8ef-6dd08f21f1ff req-17d263a4-f9bc-4c39-93a9-98da4d760a98 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3b5696a1-61c2-4bad-a8ef-6dd08f21f1ff req-17d263a4-f9bc-4c39-93a9-98da4d760a98 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.compute.manager 
[req-3b5696a1-61c2-4bad-a8ef-6dd08f21f1ff req-17d263a4-f9bc-4c39-93a9-98da4d760a98 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] No waiting events found dispatching network-vif-unplugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-3b5696a1-61c2-4bad-a8ef-6dd08f21f1ff req-17d263a4-f9bc-4c39-93a9-98da4d760a98 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-vif-unplugged-d2b17082-6210-48f7-a4ec-b177a18587ac for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:34:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-689812090',display_name='tempest-ServersNegativeTestJSON-server-689812090',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-serversnegativetestjson-server-689812090',id=6,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-05-07T17:34:27Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pc
i_requests=,power_state=1,progress=0,project_id='f9f94b6740934688aeea63198166dda4',ramdisk_id='',reservation_id='r-ptn0rtzn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1436591115',owner_user_name='tempest-ServersNegativeTestJSON-1436591115-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:34:27Z,user_data=None,user_id='e7648a0472014bc2be2325afb69590b9',uuid=fce02c77-7e4b-4107-bca0-3e4453fd67f8,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", 
"bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converting VIF {"id": "d2b17082-6210-48f7-a4ec-b177a18587ac", "address": "fa:16:3e:b2:99:94", "network": {"id": "d7a1572e-15cd-4740-b77a-a3a68ea38663", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1236773575-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f9f94b6740934688aeea63198166dda4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2b17082-62", "ovs_interfaceid": "d2b17082-6210-48f7-a4ec-b177a18587ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:b2:99:94,bridge_name='br-int',has_traffic_filtering=True,id=d2b17082-6210-48f7-a4ec-b177a18587ac,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2b17082-62') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG os_vif [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:99:94,bridge_name='br-int',has_traffic_filtering=True,id=d2b17082-6210-48f7-a4ec-b177a18587ac,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2b17082-62') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2b17082-62, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4ab8a09a-513d-4407-9f6d-9bf0d67b2e72) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:14 devstack nova-compute[86443]: INFO os_vif [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:99:94,bridge_name='br-int',has_traffic_filtering=True,id=d2b17082-6210-48f7-a4ec-b177a18587ac,network=Network(d7a1572e-15cd-4740-b77a-a3a68ea38663),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2b17082-62') Mai 07 19:35:14 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 
tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Deleting instance files /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8_del Mai 07 19:35:14 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Deletion of /opt/stack/data/nova/instances/fce02c77-7e4b-4107-bca0-3e4453fd67f8_del complete Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.137s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.663s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:35:14 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd 
tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:35:14 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Deleted allocations for instance 2f733e5d-791b-4963-a6f4-4e64ea8da505 Mai 07 19:35:15 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:35:15 devstack nova-compute[86443]: INFO nova.compute.manager [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Took 1.83 seconds to destroy the instance on the hypervisor. 
Mai 07 19:35:15 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:35:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:35:15 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:35:15 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:35:15 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:35:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.134s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ee03182f-0f8e-4baa-b4fe-8c8318e055ed tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Lock "2f733e5d-791b-4963-a6f4-4e64ea8da505" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 7.627s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:15 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Deleted allocations for instance 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735 Mai 07 19:35:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-08d28785-5447-4e59-a9c2-47a6d3ae6b20 req-4fb48a86-d7ec-42b3-9c07-7865c3c66338 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-vif-unplugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:35:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-08d28785-5447-4e59-a9c2-47a6d3ae6b20 req-4fb48a86-d7ec-42b3-9c07-7865c3c66338 service nova] Acquiring lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-08d28785-5447-4e59-a9c2-47a6d3ae6b20 req-4fb48a86-d7ec-42b3-9c07-7865c3c66338 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-08d28785-5447-4e59-a9c2-47a6d3ae6b20 req-4fb48a86-d7ec-42b3-9c07-7865c3c66338 service nova] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-08d28785-5447-4e59-a9c2-47a6d3ae6b20 req-4fb48a86-d7ec-42b3-9c07-7865c3c66338 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] No waiting events found dispatching network-vif-unplugged-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:35:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-08d28785-5447-4e59-a9c2-47a6d3ae6b20 req-4fb48a86-d7ec-42b3-9c07-7865c3c66338 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-vif-unplugged-d2b17082-6210-48f7-a4ec-b177a18587ac for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:35:17 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:35:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-9b3efb66-6bf0-4012-b0be-0635c08074bd tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Lock "462f0fe5-72f4-445d-8e8d-b9fd3a9c0735" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 8.989s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:17 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Took 2.36 seconds to deallocate network for instance. 
Mai 07 19:35:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2928a574-be6c-46e2-8a14-c2e93622c612 req-840356a1-ee23-4edf-8a3c-14d3329ea081 service nova] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Received event network-vif-deleted-d2b17082-6210-48f7-a4ec-b177a18587ac {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:35:19 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:35:20 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 
tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:35:21 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:35:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.149s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:21 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Deleted allocations for instance fce02c77-7e4b-4107-bca0-3e4453fd67f8 Mai 07 19:35:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-82024d3b-b7b2-4755-a25a-fdea4ce5b393 tempest-ServersNegativeTestJSON-1436591115 
tempest-ServersNegativeTestJSON-1436591115-project-member] Lock "fce02c77-7e4b-4107-bca0-3e4453fd67f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 10.040s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.0009319219998360495 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:35:24 devstack nova-compute[86443]: INFO nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] VM Stopped (Lifecycle Event) Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=0.23488289199985957 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=0.23387438500003555 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.0013194060002206243 future=) submitted to 
{{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=4.380300368999997 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=4.379884393999873 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-79bb32b8-9c1b-4ab6-8756-2941ab937166 None None] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:35:24 devstack nova-compute[86443]: INFO nova.compute.manager [None req-79bb32b8-9c1b-4ab6-8756-2941ab937166 None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] VM Stopped (Lifecycle Event) Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-3696aebd-502d-4184-82c0-56e1b93fa7ed tempest-DeleteServersTestJSON-416667827 tempest-DeleteServersTestJSON-416667827-project-member] [instance: 2f733e5d-791b-4963-a6f4-4e64ea8da505] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:35:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-79bb32b8-9c1b-4ab6-8756-2941ab937166 None None] [instance: 462f0fe5-72f4-445d-8e8d-b9fd3a9c0735] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:35:28 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.007133934999728808 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:35:28 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 
07 19:35:28 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:35:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] VM Stopped (Lifecycle Event) Mai 07 19:35:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: fce02c77-7e4b-4107-bca0-3e4453fd67f8] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:35:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:34 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:36 
devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:39 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Cleaning up deleted instances {{(pid=86443) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:12083}} Mai 07 19:35:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] There are 0 instances to clean {{(pid=86443) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:12092}} Mai 07 19:35:40 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Cleaning up deleted instances with incomplete migration {{(pid=86443) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:12121}} Mai 07 19:35:41 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task 
ComputeManager._cleanup_expired_console_auth_tokens {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:41 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 3.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:35:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:35:44 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.47 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:35:45 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:45 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:45 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.99 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 
07 19:35:46 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.062s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=5391MB free_disk=14.760883331298828GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", 
"numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) 
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 0 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=512MB phys_disk=25GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:35:46 up 35 min, 1 user, load average: 7.50, 5.32, 3.19\n'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Refreshing inventories for resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:822}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Updating ProviderTree inventory for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:786}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Updating inventory in ProviderTree for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Refreshing aggregate associations for resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944, aggregates: None {{(pid=86443) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:831}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Refreshing trait associations for resource provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944, traits: 
HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_MMX,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VMVGA,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_F16C,HW_CPU_X86_SSE2,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_QXL,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_AC97,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,HW_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE {{(pid=86443) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:843}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:35:48 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:35:49 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:35:49 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:49 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:35:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.803s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:49 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 1.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:35:50 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:35:50 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 52.54 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:35:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Starting instance... {{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:35:54 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:54 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:54 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 
tempest-AttachVolumeTestJSON-883651288-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:35:54 devstack nova-compute[86443]: INFO nova.compute.claims [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Claim successful on node devstack Mai 07 19:35:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:35:55 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:35:55 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.165s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Start building networks asynchronously for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:35:56 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:35:56 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:35:56 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:35:57 devstack nova-compute[86443]: DEBUG nova.policy [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd0b98555d63743c09b6d040a46f0ecdc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5de43333e69940d19777adfbbc96190f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:35:57 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Mai 07 19:35:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:57 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:35:57 devstack nova-compute[86443]: INFO nova.compute.claims [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Claim successful on node devstack Mai 07 19:35:57 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Successfully created port: d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:35:58 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Creating image(s) Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "/opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "/opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] 
Lock "/opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:35:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea 
tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.135s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None 
req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Successfully updated port: d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.195s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk 1073741824" 
returned: 0 in 0.070s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.317s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-changed-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Refreshing instance network info cache due to event network-changed-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] Acquiring lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] Acquired lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Refreshing network info cache for port d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.182s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Checking if we can resize image 
/opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Cannot resize image /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk to a smaller size. 
{{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Ensure instance console log exists: /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock 
"vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.270s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:35:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:35:59 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:00 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:36:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Allocating IP information in the background. {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:36:00 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:36:00 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:36:00 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:00 devstack nova-compute[86443]: DEBUG nova.policy [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd98e17081ef14e01bf76138813a4d56a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1ea2fed9f654419a4de1a6168d279ab', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:36:00 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:00 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Mai 07 19:36:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e5fa49d6-6d57-4295-99f2-f88eab7b5a12 req-ca9f23be-19d1-4f3a-8b12-8a3b252371ee service nova] Releasing lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquired lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:36:01 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:36:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:36:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:01 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Successfully created port: cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:36:01 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:36:02 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Creating image(s) Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "/opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.141s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:02 
devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.155s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk 1073741824" returned: 0 in 0.091s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.259s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 
07 19:36:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.176s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Checking if we can resize image /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Cannot resize image /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Ensure instance console log exists: /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:03 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:04 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Successfully updated port: cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-changed-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Refreshing instance network info cache due to event network-changed-cf792953-3429-4182-ba5e-0b8d056bc7e9. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] Acquiring lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] Acquired lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Refreshing network info cache for port cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Updating instance_info_cache with network_info: [{"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:36:05 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Instance cache missing network info. 
{{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Releasing lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Instance network_info: |[{"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver 
[None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Start _get_guest_xml network_info=[{"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 
'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:36:05 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-AttachVolumeTestJSON-server-188183399', uuid='21c9b24e-fc56-4f69-8903-64182d970d61'), owner=OwnerMeta(userid='d0b98555d63743c09b6d040a46f0ecdc', username='tempest-AttachVolumeTestJSON-883651288-project-member', projectid='5de43333e69940d19777adfbbc96190f', projectname='tempest-AttachVolumeTestJSON-883651288'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": 
"tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175365.2914255) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CPU controller missing on host. 
{{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CPU controller found on host. {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] 
Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:35:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-188183399',display_name='tempest-AttachVolumeTestJSON-server-188183399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumetestjson-server-188183399',id=8,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN8OXI61bvF6LpRgL4kB2SDd/pZ30L3qMQl3w6m3DhKbattCPpy9zT8lnp0BIdJjUt5MpeDQUs20HqqKtjuO1gnX4HzRErmxbta8CaW+KlLMe5mn0Gu9ZHEsPebf530uPw==',key_name='tempest-keypair-1557371974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5de43333e69940d19777adfbbc96190f',ramdisk_id='',reservation_id='r-w3anivb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-883651288',owner_user_name='tempest-AttachVolumeTestJSON-883651288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:35:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0b98555d63743c09b6d040a46f0ecdc',uuid=21c9b24e-fc56-4f69-8903-64182d970d61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": 
"10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Converting VIF {"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": 
"d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:05:04,bridge_name='br-int',has_traffic_filtering=True,id=d1e26fe2-8285-4e9b-9bcc-7acb4ff87129,network=Network(70b0bd16-afc8-4311-9822-4d1e650336f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1e26fe2-82') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'pci_devices' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] End _get_guest_xml xml= Mai 07 19:36:05 devstack nova-compute[86443]: 
[guest domain XML omitted: element markup was stripped during log capture; recoverable values: uuid 21c9b24e-fc56-4f69-8903-64182d970d61, name instance-00000008, memory 196608 KiB (192 MB), 1 vCPU, creation time 2026-05-07 17:36:05, display name tempest-AttachVolumeTestJSON-server-188183399, image format bare/qcow2, owner tempest-AttachVolumeTestJSON-883651288-project-member / tempest-AttachVolumeTestJSON-883651288, sysinfo manufacturer "OpenStack Foundation", product "OpenStack Nova", version 33.1.0, family "Virtual Machine", os type hvm, rng backend /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Preparing to wait for external event network-vif-plugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288
tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:35:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-188183399',display_name='tempest-AttachVolumeTestJSON-server-188183399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumetestjson-server-188183399',id=8,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN8OXI61bvF6LpRgL4kB2SDd/pZ30L3qMQl3w6m3DhKbattCPpy9zT8lnp0BIdJjUt5MpeDQUs20HqqKtjuO1gnX4HzRErmxbta8CaW+KlLMe5mn0Gu9ZHEsPebf530uPw==',key_name='tempest-keypair-1557371974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5de43333e69940d19777adfbbc96190f',ramdisk_id='',reservation_id='r-w3anivb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-883651288',owner_user_name='tempest-AttachVolumeTestJSON-883651288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:35:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0b98555d63743c09b6d040a46f0ecdc',uuid=21c9b24e-fc56-4f69-8903-64182d970d61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Converting VIF {"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:05:04,bridge_name='br-int',has_traffic_filtering=True,id=d1e26fe2-8285-4e9b-9bcc-7acb4ff87129,network=Network(70b0bd16-afc8-4311-9822-4d1e650336f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1e26fe2-82') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG os_vif [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:05:04,bridge_name='br-int',has_traffic_filtering=True,id=d1e26fe2-8285-4e9b-9bcc-7acb4ff87129,network=Network(70b0bd16-afc8-4311-9822-4d1e650336f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1e26fe2-82') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 
19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0a06b28d-6739-57c7-a00f-92a4b06402e8', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1e26fe2-82, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd1e26fe2-82, col_values=(('qos', UUID('cd656662-458a-46bf-b91b-dc2637033667')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:05 
devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd1e26fe2-82, col_values=(('external_ids', {'iface-id': 'd1e26fe2-8285-4e9b-9bcc-7acb4ff87129', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:05:04', 'vm-uuid': '21c9b24e-fc56-4f69-8903-64182d970d61'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:36:05 devstack nova-compute[86443]: INFO os_vif [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:05:04,bridge_name='br-int',has_traffic_filtering=True,id=d1e26fe2-8285-4e9b-9bcc-7acb4ff87129,network=Network(70b0bd16-afc8-4311-9822-4d1e650336f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1e26fe2-82') Mai 07 19:36:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-8bc0d677-83b4-4b77-a7ae-d4cbfd9de5ce req-2e34934e-2a1e-4ed4-bfe5-f07f16a28c13 service nova] Releasing lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquired lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:36:06 
devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:36:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:06 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:36:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:06 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Updating instance_info_cache with network_info: [{"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No VIF found with MAC fa:16:3e:0c:05:04, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Releasing lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Instance network_info: |[{"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Start _get_guest_xml network_info=[{"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:36:07 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-AttachVolumeNegativeTest-server-1692316619', uuid='0a37052d-f36d-4e75-8464-1c227b87d6e5'), owner=OwnerMeta(userid='d98e17081ef14e01bf76138813a4d56a', username='tempest-AttachVolumeNegativeTest-429871213-project-member', projectid='b1ea2fed9f654419a4de1a6168d279ab', projectname='tempest-AttachVolumeNegativeTest-429871213'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175367.8031113) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:35:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1692316619',display_name='tempest-AttachVolumeNegativeTest-server-1692316619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1692316619',id=9,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRJyleaaRVCSygfNfYeqjuLjImZnBat2EXS6jXv1VB4Tp8h8P55r2Mi1c2YWsAqlGrpPEDV0tpOF4jJYHyqYLLH2xZ41A5TsjtEBubw66zEoAcgO9q31yy943+qFNurJw==',key_name='tempest-keypair-1060373517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-4zcirlhh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVo
lumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:36:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=0a37052d-f36d-4e75-8464-1c227b87d6e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": 
{"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ad:ab,bridge_name='br-int',has_traffic_filtering=True,id=cf792953-3429-4182-ba5e-0b8d056bc7e9,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf792953-34') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:36:07 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'pci_devices' on Instance uuid 0a37052d-f36d-4e75-8464-1c227b87d6e5 
{{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] End _get_guest_xml xml= [guest domain XML omitted: the XML markup was stripped when this log was captured, leaving only element text across the journald continuation lines. Recoverable values: uuid 0a37052d-f36d-4e75-8464-1c227b87d6e5, name instance-00000009, memory 196608 KiB, 1 vCPU, nova metadata name tempest-AttachVolumeNegativeTest-server-1692316619, creationTime 2026-05-07 17:36:07, image format qcow2 (container bare), os type hvm, sysinfo OpenStack Foundation / OpenStack Nova 33.1.0, rng backend /dev/urandom] Mai 07 19:36:08 devstack nova-compute[86443]: {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Preparing to wait for external event network-vif-plugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=86443) inner
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:35:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1692316619',display_name='tempest-AttachVolumeNegativeTest-server-1692316619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1692316619',id=9,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRJyleaaRVCSygfNfYeqjuLjImZnBat2EXS6jXv1VB4Tp8h8P55r2Mi1c2YWsAqlGrpPEDV0tpOF4jJYHyqYLLH2xZ41A5TsjtEBubw66zEoAcgO9q31yy943+qFNurJw==',key_name='tempest-keypair-1060373517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-4zcirlhh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:36:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=0a37052d-f36d-4e75-8464-1c227b87d6e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], 
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": 
"cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ad:ab,bridge_name='br-int',has_traffic_filtering=True,id=cf792953-3429-4182-ba5e-0b8d056bc7e9,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf792953-34') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG os_vif [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ad:ab,bridge_name='br-int',has_traffic_filtering=True,id=cf792953-3429-4182-ba5e-0b8d056bc7e9,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf792953-34') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit 
/opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '680ecf50-4b8a-5ad7-8319-592cdab0ae62', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf792953-34, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running 
txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapcf792953-34, col_values=(('qos', UUID('ccb3aaa4-1862-4b9a-9f02-7b7fcbb24a21')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapcf792953-34, col_values=(('external_ids', {'iface-id': 'cf792953-3429-4182-ba5e-0b8d056bc7e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:ad:ab', 'vm-uuid': '0a37052d-f36d-4e75-8464-1c227b87d6e5'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:36:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:08 devstack nova-compute[86443]: INFO os_vif [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ad:ab,bridge_name='br-int',has_traffic_filtering=True,id=cf792953-3429-4182-ba5e-0b8d056bc7e9,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf792953-34') Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG 
nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: … => Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:09 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] VM Started (Lifecycle Event) Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-781a07e2-8af5-4aab-b2f8-a99850681b81 req-4c4d1da4-c45d-4680-8cb1-c55e7efed86a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-781a07e2-8af5-4aab-b2f8-a99850681b81 req-4c4d1da4-c45d-4680-8cb1-c55e7efed86a service nova] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-781a07e2-8af5-4aab-b2f8-a99850681b81 req-4c4d1da4-c45d-4680-8cb1-c55e7efed86a service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-781a07e2-8af5-4aab-b2f8-a99850681b81 req-4c4d1da4-c45d-4680-8cb1-c55e7efed86a service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-781a07e2-8af5-4aab-b2f8-a99850681b81 req-4c4d1da4-c45d-4680-8cb1-c55e7efed86a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] No event matching network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 in dict_keys([('network-vif-plugged', 'd1e26fe2-8285-4e9b-9bcc-7acb4ff87129')]) {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:348}} Mai 07 19:36:09 devstack nova-compute[86443]: WARNING nova.compute.manager [req-781a07e2-8af5-4aab-b2f8-a99850681b81 req-4c4d1da4-c45d-4680-8cb1-c55e7efed86a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received unexpected event network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 for instance with vm_state building and task_state spawning. Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:09 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: … => Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:09 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] VM Paused (Lifecycle Event) Mai 07 19:36:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad
None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:36:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No VIF found with MAC fa:16:3e:fd:ad:ab, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:36:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:10 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] During sync_power_state the instance has a pending task (spawning). Skip. 
Mai 07 19:36:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-plugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] Lock 
"21c9b24e-fc56-4f69-8903-64182d970d61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Processing event network-vif-plugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-plugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-58468be2-ceb2-4127-9d2e-05bd0b202679 
req-37994892-b975-4804-9e19-37de06ba1b53 service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] No waiting events found dispatching network-vif-plugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:36:11 devstack nova-compute[86443]: WARNING nova.compute.manager [req-58468be2-ceb2-4127-9d2e-05bd0b202679 req-37994892-b975-4804-9e19-37de06ba1b53 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received unexpected event network-vif-plugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 for instance with vm_state building and task_state spawning. 
Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Instance event wait completed in 2 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: … => Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:36:11 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] VM Resumed (Lifecycle Event) Mai 07 19:36:11 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Instance spawned successfully. 
Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea 
tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:12 devstack nova-compute[86443]: INFO nova.compute.manager 
[None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:36:12 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:12 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] VM Started (Lifecycle Event) Mai 07 19:36:12 devstack nova-compute[86443]: INFO nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Took 13.62 seconds to spawn the instance on the hypervisor. Mai 07 19:36:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:12 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:12 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] VM Paused (Lifecycle Event) Mai 07 19:36:13 
devstack nova-compute[86443]: INFO nova.compute.manager [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Took 19.01 seconds to build instance. Mai 07 19:36:13 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:36:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5643e467-d7ab-4334-9091-9e1920d6f8ea tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.552s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:14 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] During sync_power_state the instance has a pending task (spawning). Skip. 
Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6a6c1548-892d-4db9-ad09-c028723db1f8 req-bdeaafb0-7c28-47ff-a417-73058457ad44 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-vif-plugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6a6c1548-892d-4db9-ad09-c028723db1f8 req-bdeaafb0-7c28-47ff-a417-73058457ad44 service nova] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6a6c1548-892d-4db9-ad09-c028723db1f8 req-bdeaafb0-7c28-47ff-a417-73058457ad44 service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6a6c1548-892d-4db9-ad09-c028723db1f8 req-bdeaafb0-7c28-47ff-a417-73058457ad44 service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6a6c1548-892d-4db9-ad09-c028723db1f8 req-bdeaafb0-7c28-47ff-a417-73058457ad44 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Processing event network-vif-plugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) _process_instance_event 
/opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Instance event wait completed in 2 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:14 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] VM Resumed (Lifecycle Event) Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:36:14 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Instance spawned successfully. 
Mai 07 19:36:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Found default for hw_input_bus of None {{(pid=86443) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:36:15 devstack nova-compute[86443]: INFO nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Took 13.34 seconds to spawn the instance on the hypervisor. Mai 07 19:36:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:15 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:36:16 devstack nova-compute[86443]: INFO nova.compute.manager [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Took 18.84 seconds to build instance. 
Mai 07 19:36:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6f210625-6d03-4b7a-8251-818b48668668 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.415s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:17 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ca10e3f1-1e7a-422d-a812-07ee0480d50f req-34aabd89-8e22-49dd-aa3c-32db7e72a8a9 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-vif-plugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ca10e3f1-1e7a-422d-a812-07ee0480d50f req-34aabd89-8e22-49dd-aa3c-32db7e72a8a9 service nova] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ca10e3f1-1e7a-422d-a812-07ee0480d50f req-34aabd89-8e22-49dd-aa3c-32db7e72a8a9 service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:17 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [req-ca10e3f1-1e7a-422d-a812-07ee0480d50f req-34aabd89-8e22-49dd-aa3c-32db7e72a8a9 service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.039s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:17 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ca10e3f1-1e7a-422d-a812-07ee0480d50f req-34aabd89-8e22-49dd-aa3c-32db7e72a8a9 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] No waiting events found dispatching network-vif-plugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:36:17 devstack nova-compute[86443]: WARNING nova.compute.manager [req-ca10e3f1-1e7a-422d-a812-07ee0480d50f req-34aabd89-8e22-49dd-aa3c-32db7e72a8a9 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received unexpected event network-vif-plugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 for instance with vm_state active and task_state None. 
Mai 07 19:36:18 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:23 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-changed-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:23 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Refreshing instance network info cache due to event network-changed-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:36:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] Acquiring lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:36:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] Acquired lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:36:23 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Refreshing network info cache for port d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:36:23 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:23 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-changed-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Refreshing instance network info cache due to event network-changed-cf792953-3429-4182-ba5e-0b8d056bc7e9. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] Acquiring lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] Acquired lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Refreshing network info cache for port cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:36:26 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-a4269e6d-1b63-40df-a576-6e76861641da 
req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Updated VIF entry in instance network info cache for port d1e26fe2-8285-4e9b-9bcc-7acb4ff87129. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Updating instance_info_cache with network_info: [{"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:26 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:36:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:26 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:36:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a4269e6d-1b63-40df-a576-6e76861641da req-584741f2-dadc-48e4-b276-452a8014ce4a service nova] Releasing lock "refresh_cache-21c9b24e-fc56-4f69-8903-64182d970d61" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:27 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Updated VIF entry in instance network info cache for port cf792953-3429-4182-ba5e-0b8d056bc7e9. 
{{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:36:27 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Updating instance_info_cache with network_info: [{"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.83", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-12fd4c7b-a63a-4de3-bc0e-4e94079e672a req-99acbfb9-2fed-4e4b-9e29-8b7cbcfb3e17 service nova] Releasing lock "refresh_cache-0a37052d-f36d-4e75-8464-1c227b87d6e5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:28 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:29 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:33 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s {{(pid=86443) 
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:38 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'flavor' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:38 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.543s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:40 devstack nova-compute[86443]: INFO nova.compute.manager [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Attaching volume 89981085-b498-4d17-81ff-e5587dae6bb7 to /dev/vdb Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:36:40 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:fe0c:504', 'fe80::b84a:a3ff:feba:e0b7', 'fe80::fc16:3eff:fefd:adab', 'fe80::4885:dff:fefa:58b3'] Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] <== get_connector_properties: return (179ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:36:40 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Updating existing volume attachment record: 5c8e7183-c3ab-43c5-83ef-67fc4cd16dd1 {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:36:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b 
tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 1.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'flavor' on Instance uuid 0a37052d-f36d-4e75-8464-1c227b87d6e5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] systemd detected. 
{{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:36:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Mounting volume osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa at mount point /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 via systemd-run {{(pid=86443) mount_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:79}} Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:44 devstack nova-compute[86443]: INFO nova.virt.libvirt.volume.quobyte [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Mounted volume: osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.99 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.229s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'flavor' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 1.527s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:44 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] attach device 
xml: Mai 07 19:36:44 devstack nova-compute[86443]: [multi-line device XML stripped during log capture; volume serial 89981085-b498-4d17-81ff-e5587dae6bb7] {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.99 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=86443) inner
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:45 devstack nova-compute[86443]: INFO nova.compute.manager [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Attaching volume 2d1f44c8-c445-4b0f-82aa-47a492cccc8f to /dev/vdb Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] 'nvme 
version' failed. Not Retrying. {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:36:45 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:fe0c:504', 'fe80::b84a:a3ff:feba:e0b7', 'fe80::fc16:3eff:fefd:adab', 'fe80::4885:dff:fefa:58b3'] Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] <== get_connector_properties: return (115ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Updating existing volume attachment record: 238b8612-995d-4933-9354-4105659dff0f {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "b1071faa-5a0b-4d52-80a4-eee09d394886" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.01 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No BDM found with device name vdb, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No VIF found with MAC fa:16:3e:0c:05:04, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "connect_qb_volume" by 
"nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] systemd detected. {{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.006s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:46 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'flavor' on Instance uuid 0a37052d-f36d-4e75-8464-1c227b87d6e5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None 
req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:36:47 devstack nova-compute[86443]: INFO nova.compute.claims [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Claim successful on node devstack Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] attach device xml: Mai 07 19:36:47 devstack nova-compute[86443]: [multi-line device XML stripped during log capture; volume serial 2d1f44c8-c445-4b0f-82aa-47a492cccc8f] {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=86443) execute
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61/disk --force-share --output=json" returned: 0 in 0.154s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] skipping disk /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/volume-89981085-b498-4d17-81ff-e5587dae6bb7 (vdb) as it is a volume {{(pid=86443) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:12328}} Mai 07 19:36:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:48 
devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e45e9b4-df61-430f-80b1-b9f21a4b7c90 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 7.605s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-a6e6304b-932b-40e3-9264-09f721aed772 
tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] skipping disk /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22/volume-2d1f44c8-c445-4b0f-82aa-47a492cccc8f (vdb) as it is a volume {{(pid=86443) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:12328}} Mai 07 19:36:48 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.043s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=4889MB free_disk=14.669292449951172GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:48 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-a6e6304b-932b-40e3-9264-09f721aed772 
tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No BDM found with device name vdb, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No VIF found with MAC fa:16:3e:fd:ad:ab, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.261s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Start building networks asynchronously for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.943s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Allocating IP information in the background. {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:36:49 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:36:49 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:49 devstack nova-compute[86443]: DEBUG nova.policy [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '985731d9531c4316a1bc342b3bf7e8ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0ac6bf62a8e14b69b4016786de05abd9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:36:50 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 21c9b24e-fc56-4f69-8903-64182d970d61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance b1071faa-5a0b-4d52-80a4-eee09d394886 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 3 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=1088MB phys_disk=25GB used_disk=3GB total_vcpus=4 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:36:48 up 36 min, 1 user, load average: 8.55, 6.14, 3.62\n', 'num_instances': '3', 'num_vm_active': '2', 'num_task_None': '3', 'num_os_type_None': '3', 'num_proj_5de43333e69940d19777adfbbc96190f': '1', 'io_workload': '1', 'num_proj_b1ea2fed9f654419a4de1a6168d279ab': '1', 'num_vm_building': '1', 'num_proj_0ac6bf62a8e14b69b4016786de05abd9': '1'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-853fc655-b8a8-45b1-848f-735fe04abc8b tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 5.187s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Successfully created port: 
de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.286s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 1.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'flavor' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:36:51 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Creating image(s) Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "/opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "/opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 
tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "/opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.172s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:52 
devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.157s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk 1073741824" returned: 0 in 0.061s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.240s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 
07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.143s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Successfully updated port: de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) _update_port 
/opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 51.48 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Cannot resize image /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/disk to a smaller size. 
{{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Ensure instance console log exists: /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 
tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 1.550s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-changed-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Refreshing instance network info cache due to event network-changed-de56e1ed-660f-4a87-9a44-b7a9976c6dd7. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] Acquiring lock "refresh_cache-b1071faa-5a0b-4d52-80a4-eee09d394886" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] Acquired lock "refresh_cache-b1071faa-5a0b-4d52-80a4-eee09d394886" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:36:52 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Refreshing network info cache for port de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "refresh_cache-b1071faa-5a0b-4d52-80a4-eee09d394886" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:36:53 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:53 devstack nova-compute[86443]: INFO nova.compute.manager [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 
tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Attaching volume 4ca90cda-cd50-4fe8-abad-c1107b00e6d8 to /dev/vdc Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:36:53 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:fe0c:504', 'fe80::b84a:a3ff:feba:e0b7', 'fe80::fc16:3eff:fefd:adab', 'fe80::4885:dff:fefa:58b3'] Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] <== get_connector_properties: return (145ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Updating existing volume attachment record: e8d0dc6e-c916-43b9-a7e3-47477ffaaf12 {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-03dda3a0-f428-4df7-b268-7b3d2c7375b5 req-85736afa-8f52-40b5-9bc1-8d26cc1fb812 service nova] Releasing lock "refresh_cache-b1071faa-5a0b-4d52-80a4-eee09d394886" {{(pid=86443) lock 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquired lock "refresh_cache-b1071faa-5a0b-4d52-80a4-eee09d394886" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:36:54 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:36:54 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:36:54 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:54 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.50 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:36:54 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Updating instance_info_cache with network_info: [{"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:36:54 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:54 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Releasing lock "refresh_cache-b1071faa-5a0b-4d52-80a4-eee09d394886" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Instance network_info: |[{"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Start _get_guest_xml network_info=[{"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:36:55 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-VolumesAdminNegativeTest-server-1121917553', uuid='b1071faa-5a0b-4d52-80a4-eee09d394886'), owner=OwnerMeta(userid='985731d9531c4316a1bc342b3bf7e8ce', username='tempest-VolumesAdminNegativeTest-915880522-project-member', projectid='0ac6bf62a8e14b69b4016786de05abd9', projectname='tempest-VolumesAdminNegativeTest-915880522'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175415.3918512) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:36:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1121917553',display_name='tempest-VolumesAdminNegativeTest-server-1121917553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesadminnegativetest-server-1121917553',id=10,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0ac6bf62a8e14b69b4016786de05abd9',ramdisk_id='',reservation_id='r-7fdt4d9y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-915880522',owner_user_name='tempest-VolumesAdminNegativeTest-915880522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:36:51Z,user_data=None,
user_id='985731d9531c4316a1bc342b3bf7e8ce',uuid=b1071faa-5a0b-4d52-80a4-eee09d394886,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Converting VIF {"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:33:81,bridge_name='br-int',has_traffic_filtering=True,id=de56e1ed-660f-4a87-9a44-b7a9976c6dd7,network=Network(c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde56e1ed-66') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lazy-loading 'pci_devices' on Instance uuid b1071faa-5a0b-4d52-80a4-eee09d394886 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Starting 
instance... {{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] systemd detected. 
{{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'flavor' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] End _get_guest_xml xml= [guest domain XML omitted: the XML element markup was stripped when this log was captured, leaving only bare text fragments. Surviving fragments, in order: b1071faa-5a0b-4d52-80a4-eee09d394886, instance-0000000a, 196608, 1, tempest-VolumesAdminNegativeTest-server-1121917553, 2026-05-07 17:36:55, 192, 1, 0, 0, 1, True, bare, qcow2, 1, 0, virtio, tempest-VolumesAdminNegativeTest-915880522-project-member, tempest-VolumesAdminNegativeTest-915880522, OpenStack Foundation, OpenStack Nova, 33.1.0, b1071faa-5a0b-4d52-80a4-eee09d394886 (twice), Virtual Machine, hvm, 1, /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Preparing to wait for external event network-vif-plugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=86443) inner
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:36:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1121917553',display_name='tempest-VolumesAdminNegativeTest-server-1121917553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesadminnegativetest-server-1121917553',id=10,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0ac6bf62a8e14b69b4016786de05abd9',ramdisk_id='',reservation_id='r-7fdt4d9y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64
-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-915880522',owner_user_name='tempest-VolumesAdminNegativeTest-915880522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:36:51Z,user_data=None,user_id='985731d9531c4316a1bc342b3bf7e8ce',uuid=b1071faa-5a0b-4d52-80a4-eee09d394886,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Converting VIF {"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": 
"c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:33:81,bridge_name='br-int',has_traffic_filtering=True,id=de56e1ed-660f-4a87-9a44-b7a9976c6dd7,network=Network(c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde56e1ed-66') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG os_vif [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:d1:33:81,bridge_name='br-int',has_traffic_filtering=True,id=de56e1ed-660f-4a87-9a44-b7a9976c6dd7,network=Network(c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde56e1ed-66') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8258c375-d903-5a26-a434-1c21592ac98e', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:36:55 devstack 
nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde56e1ed-66, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapde56e1ed-66, col_values=(('qos', UUID('cd7dfde7-80d6-4af3-ab7e-534fe9196cc5')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapde56e1ed-66, col_values=(('external_ids', {'iface-id': 'de56e1ed-660f-4a87-9a44-b7a9976c6dd7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:33:81', 'vm-uuid': 'b1071faa-5a0b-4d52-80a4-eee09d394886'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:36:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:56 devstack nova-compute[86443]: INFO os_vif [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:33:81,bridge_name='br-int',has_traffic_filtering=True,id=de56e1ed-660f-4a87-9a44-b7a9976c6dd7,network=Network(c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde56e1ed-66') Mai 07 19:36:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:56 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:36:56 devstack nova-compute[86443]: INFO nova.compute.claims [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Claim successful on node devstack Mai 07 19:36:56 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] attach device xml: [device XML omitted: the XML element markup was stripped when this log was captured; the only surviving text fragment is 4ca90cda-cd50-4fe8-abad-c1107b00e6d8] {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:36:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory
/opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] No VIF found with MAC fa:16:3e:d1:33:81, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 
'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No BDM found with device name vdb, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No BDM found with device name vdc, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:36:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] No VIF found with MAC fa:16:3e:0c:05:04, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a2986617-e821-49f5-97e0-159c80e85071 req-75c025c4-d85a-4591-b406-0b20545946f7 service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-vif-plugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a2986617-e821-49f5-97e0-159c80e85071 req-75c025c4-d85a-4591-b406-0b20545946f7 service nova] Acquiring lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a2986617-e821-49f5-97e0-159c80e85071 req-75c025c4-d85a-4591-b406-0b20545946f7 service nova] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a2986617-e821-49f5-97e0-159c80e85071 req-75c025c4-d85a-4591-b406-0b20545946f7 service nova] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
held 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a2986617-e821-49f5-97e0-159c80e85071 req-75c025c4-d85a-4591-b406-0b20545946f7 service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Processing event network-vif-plugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.211s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Start building networks asynchronously for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:36:58 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:36:58 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.policy [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd98e17081ef14e01bf76138813a4d56a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1ea2fed9f654419a4de1a6168d279ab', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:58 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] VM Started (Lifecycle Event) Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 
19:36:58 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Instance spawned successfully. Mai 07 19:36:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:36:59 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-f6fd4df2-238d-4179-abb5-b4a1eacf418e tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 5.847s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] 
[instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Start building block device mappings for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:36:59 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:36:59 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:36:59 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] VM Paused (Lifecycle Event) Mai 07 19:37:00 devstack nova-compute[86443]: INFO nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Took 8.20 seconds to spawn the instance on the hypervisor. 
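The "Synchronizing instance power state" and "During sync_power_state the instance has a pending task (spawning). Skip." entries above show the reconciliation decision: while a task is in flight Nova leaves the DB alone, otherwise it updates the DB when it disagrees with the hypervisor. A simplified sketch of that decision (the constants mirror Nova's power-state values, 0 = NOSTATE and 1 = RUNNING, but the function is an illustration, not Nova's actual `_sync_instance_power_state`):

```python
# Illustrative sketch of the power-state sync decision seen in the log.
NOSTATE, RUNNING = 0, 1  # values as logged: DB power_state: 0, VM power_state: 1

def sync_power_state(db_power_state, vm_power_state, task_state):
    if task_state is not None:
        # a task (e.g. "spawning") owns the instance; don't fight it
        return "skip: pending task (%s)" % task_state
    if db_power_state != vm_power_state:
        # reconcile the DB with what the hypervisor reports
        return "update DB: %d -> %d" % (db_power_state, vm_power_state)
    return "in sync"

# The situation from the log: DB says NOSTATE, VM is RUNNING, task "spawning"
decision = sync_power_state(NOSTATE, RUNNING, "spawning")
```

Here `decision` is the skip branch, matching the logged behaviour for instance b1071faa-5a0b-4d52-80a4-eee09d394886.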
Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1842529b-ca69-486a-a924-8ee89ea90601 req-f8e00b35-02a3-40ca-a695-0f5557f7186a service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-vif-plugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1842529b-ca69-486a-a924-8ee89ea90601 req-f8e00b35-02a3-40ca-a695-0f5557f7186a service nova] Acquiring lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1842529b-ca69-486a-a924-8ee89ea90601 req-f8e00b35-02a3-40ca-a695-0f5557f7186a service nova] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1842529b-ca69-486a-a924-8ee89ea90601 req-f8e00b35-02a3-40ca-a695-0f5557f7186a service nova] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.003s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1842529b-ca69-486a-a924-8ee89ea90601 req-f8e00b35-02a3-40ca-a695-0f5557f7186a service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] No waiting events found dispatching network-vif-plugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Successfully created port: 6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:37:00 devstack nova-compute[86443]: WARNING nova.compute.manager [req-1842529b-ca69-486a-a924-8ee89ea90601 req-f8e00b35-02a3-40ca-a695-0f5557f7186a service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received unexpected event network-vif-plugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 for instance with vm_state active and task_state None. 
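The lock/pop sequence above ("Acquiring lock … -events", "No waiting events found", "Received unexpected event") is the external-event handshake: a builder registers the event it expects and waits on it, the Neutron-triggered handler pops and signals it, and an event with no registered waiter is reported as unexpected. A heavily simplified sketch of that registry (names are illustrative, not Nova's actual `InstanceEvents` API):

```python
# Simplified sketch of Nova's instance-event handshake from the log above.
import threading

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare(self, instance, name):
        # called by the build path before it starts waiting
        with self._lock:
            ev = threading.Event()
            self._events[(instance, name)] = ev
            return ev

    def pop(self, instance, name):
        # called by the external-event handler; None -> "unexpected event"
        with self._lock:
            return self._events.pop((instance, name), None)

events = InstanceEvents()
waiter = events.prepare("b1071faa", "network-vif-plugged")
ev = events.pop("b1071faa", "network-vif-plugged")
if ev:
    ev.set()  # wakes the thread blocked in waiter.wait()
# a second delivery finds no registered waiter, as in the WARNING entry
unexpected = events.pop("b1071faa", "network-vif-plugged") is None
```

In the log the second `network-vif-plugged-de56e1ed-…` delivery arrives after the instance is already active, so it hits exactly this unexpected-event path.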
Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:00 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] VM Resumed (Lifecycle Event) Mai 07 19:37:00 devstack nova-compute[86443]: INFO nova.compute.manager [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Took 13.62 seconds to build instance. Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:37:00 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Creating image(s) Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "/opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.007s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:00 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.168s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils 
[None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:01 devstack 
nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a6e6304b-932b-40e3-9264-09f721aed772 tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.148s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.173s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk 1073741824" returned: 0 in 0.127s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.349s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.127s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk --force-share --output=json" returned: 0 in 0.278s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None 
req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Cannot resize image /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Ensure instance console log exists: /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:02 devstack nova-compute[86443]: INFO nova.compute.manager [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Detaching volume 89981085-b498-4d17-81ff-e5587dae6bb7 Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 
6fc39b29-fdc4-4758-ad2d-2cef146465ac] Successfully updated port: 6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:37:02 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Attempting to driver detach volume 89981085-b498-4d17-81ff-e5587dae6bb7 from mountpoint /dev/vdb Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Found disk vdb by alias ua-89981085-b498-4d17-81ff-e5587dae6bb7 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Found disk vdb by alias ua-89981085-b498-4d17-81ff-e5587dae6bb7 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Attempting to detach device vdb from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the persistent domain config. 
{{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] detach device xml: [device XML body elided in capture; volume serial 89981085-b498-4d17-81ff-e5587dae6bb7] {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:37:02 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Successfully detached device vdb from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the persistent domain config. Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] (1/8): Attempting to detach device vdb with device alias ua-89981085-b498-4d17-81ff-e5587dae6bb7 from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] detach device xml: [device XML body elided in capture; volume serial 89981085-b498-4d17-81ff-e5587dae6bb7] {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-changed-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Refreshing instance network info cache due to event network-changed-6e136c5c-f540-4142-8d9a-d33072d905cc. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] Acquiring lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] Acquired lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Refreshing network info cache for port 6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) _get_instance_nw_info
/opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event ua-89981085-b498-4d17-81ff-e5587dae6bb7> from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}} Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Start waiting for the detach event from libvirt for device vdb with device alias ua-89981085-b498-4d17-81ff-e5587dae6bb7 for instance 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}} Mai 07 19:37:02 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Successfully detached device vdb from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the live domain config. 
Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Couldn't unmount the Quobyte Volume at /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Mai 07 19:37:02 devstack nova-compute[86443]: Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22
Mai 07 19:37:02 devstack nova-compute[86443]: Exit code: 32
Mai 07 19:37:02 devstack nova-compute[86443]: Stdout: ''
Mai 07 19:37:02 devstack nova-compute[86443]: Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n'
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Traceback (most recent call last):
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte   File "/opt/stack/nova/nova/virt/libvirt/volume/quobyte.py", line 96, in umount_volume
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte     nova.privsep.libvirt.umount(mnt_base)
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte   File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 315, in _wrap
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte     return self.channel.remote_call(name, args, kwargs, r_call_timeout)
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte   File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 262, in remote_call
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte     raise exc_type(*result[2])
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Exit code: 32
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stdout: ''
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n'
Mai 07 19:37:02 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte
Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.023s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:37:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}}
Mai 07 19:37:03 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:37:03 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}}
Mai 07 19:37:03 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'flavor' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
Mai 07 19:37:03 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}}
Mai 07 19:37:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-cb3dd8d6-fca4-4a4b-9f75-cc3431d9dc98 req-a45c256f-6dcf-4673-aa63-b397aaf5c379 service nova] Releasing lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}}
Mai 07 19:37:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquired lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}}
Mai 07 19:37:04 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}}
Mai 07 19:37:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:37:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-e144ed94-ca3a-4ec7-af2f-216368253517 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.662s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:37:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.056s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:37:04 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}}
Mai 07 19:37:04 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:37:05 devstack nova-compute[86443]: INFO nova.compute.manager [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Detaching volume 4ca90cda-cd50-4fe8-abad-c1107b00e6d8
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Updating instance_info_cache with network_info: [{"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}}
Mai 07 19:37:05 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Attempting to driver detach volume 4ca90cda-cd50-4fe8-abad-c1107b00e6d8 from mountpoint /dev/vdc
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Found disk vdc by alias ua-4ca90cda-cd50-4fe8-abad-c1107b00e6d8 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Found disk vdc by alias ua-4ca90cda-cd50-4fe8-abad-c1107b00e6d8 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Attempting to detach device vdc from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the persistent domain config. {{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] detach device xml:
Mai 07 19:37:05 devstack nova-compute[86443]: [device XML stripped in capture; only the serial 4ca90cda-cd50-4fe8-abad-c1107b00e6d8 survived]
Mai 07 19:37:05 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}}
Mai 07 19:37:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Successfully detached device vdc from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the persistent domain config.
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] (1/8): Attempting to detach device vdc with device alias ua-4ca90cda-cd50-4fe8-abad-c1107b00e6d8 from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] detach device xml:
Mai 07 19:37:05 devstack nova-compute[86443]: [device XML stripped in capture; only the serial 4ca90cda-cd50-4fe8-abad-c1107b00e6d8 survived]
Mai 07 19:37:05 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event [event repr stripped in capture; device alias ua-4ca90cda-cd50-4fe8-abad-c1107b00e6d8] from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Start waiting for the detach event from libvirt for device vdc with device alias ua-4ca90cda-cd50-4fe8-abad-c1107b00e6d8 for instance 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}}
Mai 07 19:37:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Successfully detached device vdc from instance 21c9b24e-fc56-4f69-8903-64182d970d61 from the live domain config.
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Couldn't unmount the Quobyte Volume at /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Mai 07 19:37:05 devstack nova-compute[86443]: Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22
Mai 07 19:37:05 devstack nova-compute[86443]: Exit code: 32
Mai 07 19:37:05 devstack nova-compute[86443]: Stdout: ''
Mai 07 19:37:05 devstack nova-compute[86443]: Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n'
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Traceback (most recent call last):
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte   File "/opt/stack/nova/nova/virt/libvirt/volume/quobyte.py", line 96, in umount_volume
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte     nova.privsep.libvirt.umount(mnt_base)
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte   File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 315, in _wrap
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte     return self.channel.remote_call(name, args, kwargs, r_call_timeout)
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte   File "/opt/stack/data/venv/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 262, in remote_call
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte     raise exc_type(*result[2])
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Command: umount /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Exit code: 32
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stdout: ''
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte Stderr: 'umount: /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22: target is busy.\n'
Mai 07 19:37:05 devstack nova-compute[86443]: ERROR nova.virt.libvirt.volume.quobyte
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.035s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Releasing lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Instance network_info: |[{"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Start _get_guest_xml network_info=[{"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}}
Mai 07 19:37:05 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-AttachVolumeNegativeTest-server-1018181458', uuid='6fc39b29-fdc4-4758-ad2d-2cef146465ac'), owner=OwnerMeta(userid='d98e17081ef14e01bf76138813a4d56a', username='tempest-AttachVolumeNegativeTest-429871213-project-member', projectid='b1ea2fed9f654419a4de1a6168d279ab', projectname='tempest-AttachVolumeNegativeTest-429871213'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175425.6737797) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller found on host. {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:36:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1018181458',display_name='tempest-AttachVolumeNegativeTest-server-1018181458',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1018181458',id=11,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEcZl4rfWa3r5hE5SMQxOCfdNzAUmYNmgxXXCInLjeV00Bemx+RXXXLcm6zQix/FmxDWcZwK/2QEHNjA7upwpTowDd4MzdehaWKo/qW+BPLek3Gw45HabAThjXLc4nfU5g==',key_name='tempest-keypair-543734304',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-386t498m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVo
lumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:37:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=6fc39b29-fdc4-4758-ad2d-2cef146465ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": 
"da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:cf:5e,bridge_name='br-int',has_traffic_filtering=True,id=6e136c5c-f540-4142-8d9a-d33072d905cc,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e136c5c-f5') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'pci_devices' on Instance uuid 6fc39b29-fdc4-4758-ad2d-2cef146465ac {{(pid=86443) 
obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
Mai 07 19:37:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'flavor' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}}
Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] End _get_guest_xml xml= [libvirt domain XML not reproduced: the markup was stripped in this capture, leaving only element values; surviving values include uuid 6fc39b29-fdc4-4758-ad2d-2cef146465ac, domain name instance-0000000b, display name tempest-AttachVolumeNegativeTest-server-1018181458, creationTime 2026-05-07 17:37:05, memory 196608 KiB (flavor: 192 MB, 1 vCPU, root 1 GB, ephemeral 0, swap 0), image container format bare / disk format qcow2 (min_disk 1, min_ram 0), owner tempest-AttachVolumeNegativeTest-429871213 / tempest-AttachVolumeNegativeTest-429871213-project-member, sysinfo OpenStack Foundation / OpenStack Nova 33.1.0 / Virtual Machine, os type hvm, virtio RNG backed by /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}}
Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690
tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Preparing to wait for external event network-vif-plugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:36:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1018181458',display_name='tempest-AttachVolumeNegativeTest-server-1018181458',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1018181458',id=11,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEcZl4rfWa3r5hE5SMQxOCfdNzAUmYNmgxXXCInLjeV00Bemx+RXXXLcm6zQix/FmxDWcZwK/2QEHNjA7upwpTowDd4MzdehaWKo/qW+BPLek3Gw45HabAThjXLc4nfU5g==',key_name='tempest-keypair-543734304',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-386t498m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specif
ied.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:37:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=6fc39b29-fdc4-4758-ad2d-2cef146465ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": 
"6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:cf:5e,bridge_name='br-int',has_traffic_filtering=True,id=6e136c5c-f540-4142-8d9a-d33072d905cc,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e136c5c-f5') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG os_vif [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:84:cf:5e,bridge_name='br-int',has_traffic_filtering=True,id=6e136c5c-f540-4142-8d9a-d33072d905cc,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e136c5c-f5') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3c51ba94-7f00-5485-b044-944801894cd2', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:06 
devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e136c5c-f5, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6e136c5c-f5, col_values=(('qos', UUID('3e099fe0-8594-4223-ad70-aca1bcb66afd')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6e136c5c-f5, col_values=(('external_ids', {'iface-id': '6e136c5c-f540-4142-8d9a-d33072d905cc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:cf:5e', 'vm-uuid': '6fc39b29-fdc4-4758-ad2d-2cef146465ac'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:06 devstack nova-compute[86443]: INFO os_vif [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:cf:5e,bridge_name='br-int',has_traffic_filtering=True,id=6e136c5c-f540-4142-8d9a-d33072d905cc,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e136c5c-f5') Mai 07 19:37:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-640b3464-fa22-4827-905b-0cecb7f2e6a9 tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 2.662s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:37:07 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No VIF found with MAC fa:16:3e:84:cf:5e, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ef03f7a0-335d-4a87-850b-9c3a07c6d4ed req-55ecbf54-36f3-42ba-be31-d29521484a2e service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-vif-plugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ef03f7a0-335d-4a87-850b-9c3a07c6d4ed 
req-55ecbf54-36f3-42ba-be31-d29521484a2e service nova] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ef03f7a0-335d-4a87-850b-9c3a07c6d4ed req-55ecbf54-36f3-42ba-be31-d29521484a2e service nova] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ef03f7a0-335d-4a87-850b-9c3a07c6d4ed req-55ecbf54-36f3-42ba-be31-d29521484a2e service nova] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ef03f7a0-335d-4a87-850b-9c3a07c6d4ed req-55ecbf54-36f3-42ba-be31-d29521484a2e service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Processing event network-vif-plugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} 
Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:08 devstack nova-compute[86443]: INFO nova.compute.manager [None 
req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Terminating instance Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Start destroying the instance on the hypervisor. {{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999268622999807 future=) {{(pid=86443) 
_log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.988612800999817 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.988070831999721 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:09 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Instance destroyed successfully. Mai 07 19:37:09 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lazy-loading 'resources' on Instance uuid 21c9b24e-fc56-4f69-8903-64182d970d61 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:35:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-188183399',display_name='tempest-AttachVolumeTestJSON-server-188183399',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumetestjson-server-188183399',id=8,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN8OXI61bvF6LpRgL4kB2SDd/pZ30L3qMQl3w6m3DhKbattCPpy9zT8lnp0BIdJjUt5MpeDQUs20HqqKtjuO1gnX4HzRErmxbta8CaW+KlLMe5mn0Gu9ZHEsPebf530uPw==',key_name='tempest-keypair-1557371974',keypairs=,launch_index=0,launched_at=2026-05-07T17:36:12Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5de43333e69940d19777adfbbc96190f',ramdisk_id='',reservation_id='r-w3anivb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-883651288',owner_user_name='tempest-AttachVolumeTestJSON-883651288-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:36:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0b98555d63743c09b6d040a46f0ecdc',uuid=21c9b24e-fc56-4f69-8903-64182d970d61,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": 
"tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Converting VIF {"id": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "address": "fa:16:3e:0c:05:04", "network": {"id": "70b0bd16-afc8-4311-9822-4d1e650336f4", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2112651691-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5de43333e69940d19777adfbbc96190f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1e26fe2-82", "ovs_interfaceid": "d1e26fe2-8285-4e9b-9bcc-7acb4ff87129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0c:05:04,bridge_name='br-int',has_traffic_filtering=True,id=d1e26fe2-8285-4e9b-9bcc-7acb4ff87129,network=Network(70b0bd16-afc8-4311-9822-4d1e650336f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1e26fe2-82') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG os_vif [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:05:04,bridge_name='br-int',has_traffic_filtering=True,id=d1e26fe2-8285-4e9b-9bcc-7acb4ff87129,network=Network(70b0bd16-afc8-4311-9822-4d1e650336f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1e26fe2-82') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] 
Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:10 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] VM Started (Lifecycle Event) Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e26fe2-82, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) 
__log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=cd656662-458a-46bf-b91b-dc2637033667) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:10 devstack nova-compute[86443]: INFO os_vif [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:05:04,bridge_name='br-int',has_traffic_filtering=True,id=d1e26fe2-8285-4e9b-9bcc-7acb4ff87129,network=Network(70b0bd16-afc8-4311-9822-4d1e650336f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1e26fe2-82') Mai 07 19:37:10 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Deleting instance files /opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61_del Mai 07 19:37:10 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Deletion of 
/opt/stack/data/nova/instances/21c9b24e-fc56-4f69-8903-64182d970d61_del complete Mai 07 19:37:10 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Instance spawned successfully. Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "b1071faa-5a0b-4d52-80a4-eee09d394886" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock 
"b1071faa-5a0b-4d52-80a4-eee09d394886-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:10 devstack nova-compute[86443]: INFO nova.compute.manager [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Terminating instance Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Synchronizing instance power state 
after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:37:11 devstack nova-compute[86443]: INFO nova.compute.manager [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Took 1.60 seconds to destroy the instance on the hypervisor. Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:37:11 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Found default for hw_video_model of virtio {{(pid=86443) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Start destroying the instance on the hypervisor. {{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:11 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] VM Paused (Lifecycle Event) Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: INFO nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Took 10.81 seconds to spawn the instance on the hypervisor. 
Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:11 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Instance destroyed successfully. Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lazy-loading 'resources' on Instance uuid b1071faa-5a0b-4d52-80a4-eee09d394886 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2e62acee-e95f-438e-a050-3f40a30e5ae2 req-1106dcde-376a-4026-87d8-f73de84f6a32 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-2e62acee-e95f-438e-a050-3f40a30e5ae2 req-1106dcde-376a-4026-87d8-f73de84f6a32 service nova] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils 
[req-2e62acee-e95f-438e-a050-3f40a30e5ae2 req-1106dcde-376a-4026-87d8-f73de84f6a32 service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-2e62acee-e95f-438e-a050-3f40a30e5ae2 req-1106dcde-376a-4026-87d8-f73de84f6a32 service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2e62acee-e95f-438e-a050-3f40a30e5ae2 req-1106dcde-376a-4026-87d8-f73de84f6a32 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] No waiting events found dispatching network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2e62acee-e95f-438e-a050-3f40a30e5ae2 req-1106dcde-376a-4026-87d8-f73de84f6a32 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2efed700-0a02-48ed-afbc-40624608be7c req-876b9901-4e3d-4cf8-86b0-e9b2cd0fc316 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-vif-plugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-2efed700-0a02-48ed-afbc-40624608be7c req-876b9901-4e3d-4cf8-86b0-e9b2cd0fc316 service nova] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-2efed700-0a02-48ed-afbc-40624608be7c req-876b9901-4e3d-4cf8-86b0-e9b2cd0fc316 service nova] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-2efed700-0a02-48ed-afbc-40624608be7c req-876b9901-4e3d-4cf8-86b0-e9b2cd0fc316 service nova] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.005s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2efed700-0a02-48ed-afbc-40624608be7c req-876b9901-4e3d-4cf8-86b0-e9b2cd0fc316 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] No waiting events found dispatching network-vif-plugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:12 devstack nova-compute[86443]: WARNING nova.compute.manager [req-2efed700-0a02-48ed-afbc-40624608be7c req-876b9901-4e3d-4cf8-86b0-e9b2cd0fc316 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received unexpected event network-vif-plugged-6e136c5c-f540-4142-8d9a-d33072d905cc for instance with vm_state building and task_state spawning. Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:12 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] VM Resumed (Lifecycle Event) Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:36:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1121917553',display_name='tempest-VolumesAdminNegativeTest-server-1121917553',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesadminnegativetest-server-1121917553',id=10,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-05-07T17:37:00Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='0ac6bf62a8e14b69b4016786de05abd9',ramdisk_id='',reservation_id='r-7fdt4d9y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-915880522',owner_user_name='tempest-VolumesAdminNegativeTest-915880522-project-member'},tags=,task_
state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:37:00Z,user_data=None,user_id='985731d9531c4316a1bc342b3bf7e8ce',uuid=b1071faa-5a0b-4d52-80a4-eee09d394886,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Converting VIF {"id": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "address": "fa:16:3e:d1:33:81", "network": {"id": "c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-133065592-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0ac6bf62a8e14b69b4016786de05abd9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde56e1ed-66", "ovs_interfaceid": "de56e1ed-660f-4a87-9a44-b7a9976c6dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:33:81,bridge_name='br-int',has_traffic_filtering=True,id=de56e1ed-660f-4a87-9a44-b7a9976c6dd7,network=Network(c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde56e1ed-66') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG os_vif [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:33:81,bridge_name='br-int',has_traffic_filtering=True,id=de56e1ed-660f-4a87-9a44-b7a9976c6dd7,network=Network(c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde56e1ed-66') {{(pid=86443) unplug 
/opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:37:12 devstack nova-compute[86443]: INFO nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Took 16.31 seconds to build instance. Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde56e1ed-66, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=cd7dfde7-80d6-4af3-ab7e-534fe9196cc5) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:12 devstack nova-compute[86443]: INFO os_vif [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:33:81,bridge_name='br-int',has_traffic_filtering=True,id=de56e1ed-660f-4a87-9a44-b7a9976c6dd7,network=Network(c2d4c0f6-c2c9-48b7-9c2a-f6fc10a00679),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde56e1ed-66') Mai 07 19:37:12 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Deleting instance files /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886_del Mai 07 19:37:12 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Deletion of /opt/stack/data/nova/instances/b1071faa-5a0b-4d52-80a4-eee09d394886_del complete Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.837s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:12 devstack nova-compute[86443]: INFO nova.compute.manager [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf 
tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Took 1.59 seconds to destroy the instance on the hypervisor. Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:37:12 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:37:12 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:37:12 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:37:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:37:13 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999441270000261 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Acquiring lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Lock "21c9b24e-fc56-4f69-8903-64182d970d61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] No waiting events found dispatching network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-unplugged-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-vif-unplugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Acquiring lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] No waiting events found dispatching 
network-vif-unplugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-vif-unplugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-vif-unplugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Acquiring lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] Lock 
"b1071faa-5a0b-4d52-80a4-eee09d394886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] No waiting events found dispatching network-vif-unplugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-bfc82d65-e36f-42af-8983-21f8e7a98966 req-d81ca536-fe82-42d6-b341-818237215a6c service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-vif-unplugged-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-d5f8de46-080c-4eff-838f-cf4fd680d798 req-7dc769c9-5a4b-4b5e-9c61-cf6bc2cb6b4d service nova] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Received event network-vif-deleted-de56e1ed-660f-4a87-9a44-b7a9976c6dd7 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:14 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] Took 1.71 seconds to deallocate network for instance. 
Mai 07 19:37:14 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:15 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Took 4.24 seconds to deallocate network for instance. 
Mai 07 19:37:15 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:37:15 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:37:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:15 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:37:16 
devstack nova-compute[86443]: DEBUG nova.compute.manager [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Received event network-vif-deleted-d1e26fe2-8285-4e9b-9bcc-7acb4ff87129 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-changed-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Refreshing instance network info cache due to event network-changed-6e136c5c-f540-4142-8d9a-d33072d905cc. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] Acquiring lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] Acquired lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Refreshing network info cache for port 6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.275s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.548s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:16 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Deleted allocations for instance b1071faa-5a0b-4d52-80a4-eee09d394886 Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:37:16 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:37:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:17 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:37:17 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:37:17 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:17 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Updated VIF entry in instance network info cache for port 6e136c5c-f540-4142-8d9a-d33072d905cc. 
{{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:37:17 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Updating instance_info_cache with network_info: [{"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.3", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-03ca8efe-5aab-4e5b-89ba-aa85ab2abecf tempest-VolumesAdminNegativeTest-915880522 tempest-VolumesAdminNegativeTest-915880522-project-member] Lock "b1071faa-5a0b-4d52-80a4-eee09d394886" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 6.764s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.207s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:17 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Deleted allocations for instance 21c9b24e-fc56-4f69-8903-64182d970d61 Mai 07 19:37:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-061e8e41-8393-437d-add4-ef3ac5eec0c4 req-d4bf9dd5-d23f-4700-b436-0df284d58d96 service nova] Releasing lock "refresh_cache-6fc39b29-fdc4-4758-ad2d-2cef146465ac" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:37:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-a522fe17-efa6-4865-8154-1fd2a916a17b tempest-AttachVolumeTestJSON-883651288 tempest-AttachVolumeTestJSON-883651288-project-member] Lock "21c9b24e-fc56-4f69-8903-64182d970d61" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 9.801s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.0043787849999716855 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:24 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:24 devstack nova-compute[86443]: INFO nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] VM Stopped (Lifecycle Event) Mai 07 19:37:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=3.7212746660002267 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:24 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=3.720898361000309 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:25 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-a745fd5c-972a-40a8-8a2e-11d058f69c78 tempest-ServersNegativeTestJSON-1436591115 tempest-ServersNegativeTestJSON-1436591115-project-member] [instance: 21c9b24e-fc56-4f69-8903-64182d970d61] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:26
devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:27 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:28 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:28 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.0013731659996665257 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:28 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:28 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: b1071faa-5a0b-4d52-80a4-eee09d394886] VM Stopped (Lifecycle Event) Mai 07 19:37:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-78fdd882-6803-4176-886d-4b85a102a690 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 
b1071faa-5a0b-4d52-80a4-eee09d394886] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:30 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:32 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:36 devstack nova-compute[86443]: INFO nova.compute.manager [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Terminating instance Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Start destroying the instance on the hypervisor. 
{{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999492393999844 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.997473701999752 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.99699207599997 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:37 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Instance destroyed successfully.
Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'resources' on Instance uuid 6fc39b29-fdc4-4758-ad2d-2cef146465ac {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-98a68639-c3b0-4ddf-8864-292a87a4b2ad req-38f02422-bdfb-4918-8ec4-6f313f15ac8d service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-vif-unplugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-98a68639-c3b0-4ddf-8864-292a87a4b2ad req-38f02422-bdfb-4918-8ec4-6f313f15ac8d service nova] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-98a68639-c3b0-4ddf-8864-292a87a4b2ad req-38f02422-bdfb-4918-8ec4-6f313f15ac8d service nova] Lock 
"6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-98a68639-c3b0-4ddf-8864-292a87a4b2ad req-38f02422-bdfb-4918-8ec4-6f313f15ac8d service nova] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-98a68639-c3b0-4ddf-8864-292a87a4b2ad req-38f02422-bdfb-4918-8ec4-6f313f15ac8d service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] No waiting events found dispatching network-vif-unplugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-98a68639-c3b0-4ddf-8864-292a87a4b2ad req-38f02422-bdfb-4918-8ec4-6f313f15ac8d service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-vif-unplugged-6e136c5c-f540-4142-8d9a-d33072d905cc for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:36:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1018181458',display_name='tempest-AttachVolumeNegativeTest-server-1018181458',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1018181458',id=11,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEcZl4rfWa3r5hE5SMQxOCfdNzAUmYNmgxXXCInLjeV00Bemx+RXXXLcm6zQix/FmxDWcZwK/2QEHNjA7upwpTowDd4MzdehaWKo/qW+BPLek3Gw45HabAThjXLc4nfU5g==',key_name='tempest-keypair-543734304',keypairs=,launch_index=0,launched_at=2026-05-07T17:37:11Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-386t498m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:37:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=6fc39b29-fdc4-4758-ad2d-2cef146465ac,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.3", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "6e136c5c-f540-4142-8d9a-d33072d905cc", "address": "fa:16:3e:84:cf:5e", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.3", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e136c5c-f5", "ovs_interfaceid": "6e136c5c-f540-4142-8d9a-d33072d905cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:cf:5e,bridge_name='br-int',has_traffic_filtering=True,id=6e136c5c-f540-4142-8d9a-d33072d905cc,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e136c5c-f5') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG os_vif [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:cf:5e,bridge_name='br-int',has_traffic_filtering=True,id=6e136c5c-f540-4142-8d9a-d33072d905cc,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e136c5c-f5') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: 
DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e136c5c-f5, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3e099fe0-8594-4223-ad70-aca1bcb66afd) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:37 devstack nova-compute[86443]: INFO os_vif [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:cf:5e,bridge_name='br-int',has_traffic_filtering=True,id=6e136c5c-f540-4142-8d9a-d33072d905cc,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e136c5c-f5') Mai 07 19:37:37 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Deleting instance files /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac_del Mai 07 19:37:37 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Deletion of /opt/stack/data/nova/instances/6fc39b29-fdc4-4758-ad2d-2cef146465ac_del complete Mai 07 19:37:38 devstack nova-compute[86443]: INFO nova.compute.manager [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Took 1.34 seconds to destroy the instance on the hypervisor. 
Mai 07 19:37:38 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:37:38 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:37:38 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:37:38 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:37:38 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-vif-unplugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] Acquiring lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] No waiting events found dispatching network-vif-unplugged-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-vif-unplugged-6e136c5c-f540-4142-8d9a-d33072d905cc for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Received event network-vif-deleted-6e136c5c-f540-4142-8d9a-d33072d905cc {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:39 devstack nova-compute[86443]: INFO nova.compute.manager [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Neutron deleted interface 6e136c5c-f540-4142-8d9a-d33072d905cc; detaching it from the instance and deleting it from the info cache Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-17faf7ea-f478-49ab-a696-91697a8bc580 req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:39 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:40 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-17faf7ea-f478-49ab-a696-91697a8bc580 
req-8982f5d9-cb04-4bf9-a9af-e0e78fca42eb service nova] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Detach interface failed, port_id=6e136c5c-f540-4142-8d9a-d33072d905cc, reason: Instance 6fc39b29-fdc4-4758-ad2d-2cef146465ac could not be found. {{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}} Mai 07 19:37:40 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Took 2.24 seconds to deallocate network for instance. Mai 07 19:37:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:40 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:40 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:41 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:37:41 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:37:41 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Starting instance... 
{{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:37:41 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:37:41 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.202s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:42 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.344s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:42 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:37:42 devstack nova-compute[86443]: INFO nova.compute.claims [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Claim successful on node devstack Mai 07 19:37:42 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Deleted allocations for instance 6fc39b29-fdc4-4758-ad2d-2cef146465ac Mai 07 19:37:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:43 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-5e1ff759-918a-468b-a3f1-f1e433cee5e1 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "6fc39b29-fdc4-4758-ad2d-2cef146465ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 6.995s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:43 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:37:43 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:37:43 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:37:44 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:44 devstack nova-compute[86443]: DEBUG 
nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:37:44 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 1.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:37:44 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.230s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:37:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:37:44 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:37:44 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:37:44 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.99 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG nova.policy [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30ba9350663b488b9fbe17e4811ad191', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24e6883199094fd29e71ed13beb20723', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 
'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:37:45 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:45 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:46 devstack nova-compute[86443]: INFO nova.compute.manager [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Detaching volume 2d1f44c8-c445-4b0f-82aa-47a492cccc8f Mai 07 19:37:46 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Attempting to driver detach volume 2d1f44c8-c445-4b0f-82aa-47a492cccc8f from mountpoint /dev/vdb Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Found disk vdb by alias ua-2d1f44c8-c445-4b0f-82aa-47a492cccc8f {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Found disk vdb by alias ua-2d1f44c8-c445-4b0f-82aa-47a492cccc8f {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c 
tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Attempting to detach device vdb from instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 from the persistent domain config. {{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] detach device xml: Mai 07 19:37:46 devstack nova-compute[86443]: [multi-line device XML body stripped in this capture; only the serial 2d1f44c8-c445-4b0f-82aa-47a492cccc8f remains] Mai 07 19:37:46 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:37:46 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully detached device vdb from instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 from the persistent domain config. Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] (1/8): Attempting to detach device vdb with device alias ua-2d1f44c8-c445-4b0f-82aa-47a492cccc8f from instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] detach device xml: Mai 07 19:37:46 devstack nova-compute[86443]: [multi-line device XML body stripped in this capture; only the serial 2d1f44c8-c445-4b0f-82aa-47a492cccc8f remains] Mai 07 19:37:46 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event ua-2d1f44c8-c445-4b0f-82aa-47a492cccc8f> from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Start waiting for the detach event from libvirt for device vdb with device alias ua-2d1f44c8-c445-4b0f-82aa-47a492cccc8f for instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:46 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully detached device vdb from instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 from the live domain config. 
Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.034s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 
0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:37:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:37:47 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Creating image(s) Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "/opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "/opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] 
Lock "/opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e 
tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.298s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None 
req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'flavor' on Instance uuid 0a37052d-f36d-4e75-8464-1c227b87d6e5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.151s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk 1073741824" returned: 0 in 0.090s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.266s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.180s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk. 
size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json" returned: 0 in 0.212s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:47 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk --force-share --output=json" returned: 0 in 0.312s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Cannot resize image /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Ensure instance console log exists: /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG 
oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5/disk --force-share --output=json" returned: 0 in 0.170s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:48 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Successfully created port: 47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.074s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=5076MB free_disk=14.71097183227539GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": 
"0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2985aa8f-f969-4b7c-8509-35a75208ed6c tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.868s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance a862cbbd-165b-49ae-b766-9d4f72038ec3 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. 
{{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 2 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=896MB phys_disk=25GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:37:48 up 37 min, 1 user, load average: 8.17, 6.43, 3.88\n', 'num_instances': '2', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_b1ea2fed9f654419a4de1a6168d279ab': '1', 'io_workload': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_proj_24e6883199094fd29e71ed13beb20723': '1'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s {{(pid=86443) 
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:49 devstack nova-compute[86443]: INFO nova.compute.manager [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Terminating instance Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory 
encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Start destroying the instance on the hypervisor. {{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:49 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999371987999893 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:50 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Instance destroyed successfully. 
Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'resources' on Instance uuid 0a37052d-f36d-4e75-8464-1c227b87d6e5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Successfully updated port: 47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-changed-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Refreshing instance network info cache due to event network-changed-47c26b58-3415-4461-a3e5-508baddcf0d1. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] Acquiring lock "refresh_cache-a862cbbd-165b-49ae-b766-9d4f72038ec3" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] Acquired lock "refresh_cache-a862cbbd-165b-49ae-b766-9d4f72038ec3" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Refreshing network info cache for port 47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.217s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval 
looping call 'nova.service.Service.periodic_tasks' sleeping for 1.99 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fe7b7f04-5f43-4ed0-92f2-dfcc793e93b1 req-586b6c0d-e74a-4cfc-80f4-4548b40e420c service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-vif-unplugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fe7b7f04-5f43-4ed0-92f2-dfcc793e93b1 req-586b6c0d-e74a-4cfc-80f4-4548b40e420c service nova] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fe7b7f04-5f43-4ed0-92f2-dfcc793e93b1 req-586b6c0d-e74a-4cfc-80f4-4548b40e420c service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fe7b7f04-5f43-4ed0-92f2-dfcc793e93b1 req-586b6c0d-e74a-4cfc-80f4-4548b40e420c service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fe7b7f04-5f43-4ed0-92f2-dfcc793e93b1 
req-586b6c0d-e74a-4cfc-80f4-4548b40e420c service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] No waiting events found dispatching network-vif-unplugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fe7b7f04-5f43-4ed0-92f2-dfcc793e93b1 req-586b6c0d-e74a-4cfc-80f4-4548b40e420c service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-vif-unplugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:35:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1692316619',display_name='tempest-AttachVolumeNegativeTest-server-1692316619',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1692316619',id=9,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRJyleaaRVCSygfNfYeqjuLjImZnBat2EXS6jXv1VB4Tp8h8P55r2Mi1c2YWsAqlGrpPEDV0tpOF4jJYHyqYLLH2xZ41A5TsjtEBubw66zEoAcgO9q31yy943+qFNurJw==',key_name='tempest-keypair-1060373517',keypairs=,launch_index=0,launched_at=2026-05-07T17:36:15Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-4zcirlhh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:36:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=0a37052d-f36d-4e75-8464-1c227b87d6e5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.83", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "address": "fa:16:3e:fd:ad:ab", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.83", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf792953-34", "ovs_interfaceid": "cf792953-3429-4182-ba5e-0b8d056bc7e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:ad:ab,bridge_name='br-int',has_traffic_filtering=True,id=cf792953-3429-4182-ba5e-0b8d056bc7e9,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf792953-34') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG os_vif [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:ad:ab,bridge_name='br-int',has_traffic_filtering=True,id=cf792953-3429-4182-ba5e-0b8d056bc7e9,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf792953-34') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: 
DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf792953-34, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ccb3aaa4-1862-4b9a-9f02-7b7fcbb24a21) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "refresh_cache-a862cbbd-165b-49ae-b766-9d4f72038ec3" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:50 devstack nova-compute[86443]: INFO os_vif [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:ad:ab,bridge_name='br-int',has_traffic_filtering=True,id=cf792953-3429-4182-ba5e-0b8d056bc7e9,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf792953-34') Mai 07 19:37:50 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Deleting instance files /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5_del Mai 07 19:37:50 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Deletion of /opt/stack/data/nova/instances/0a37052d-f36d-4e75-8464-1c227b87d6e5_del complete Mai 07 19:37:50 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:37:50 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:37:51 devstack nova-compute[86443]: INFO nova.compute.manager [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Took 1.35 seconds to destroy the instance on the hypervisor. Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:37:51 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:51 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-38dbdf20-5164-4cb9-98da-b30a43e342a4 req-1f797d3d-dc58-455d-919f-66d576f769fb service nova] Releasing lock "refresh_cache-a862cbbd-165b-49ae-b766-9d4f72038ec3" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquired lock "refresh_cache-a862cbbd-165b-49ae-b766-9d4f72038ec3" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:37:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, 
remaining_delay=-0.0019518829999469745 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=13.073004419999961 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=13.072617778999756 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:52 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] VM Stopped (Lifecycle Event) Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG
oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 51.64 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2e9784c0-9e29-4c1e-9798-890e80e4a5a5 req-1c37166a-2fbc-4df2-8d9a-c70b44ff0bdf service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-vif-deleted-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:52 devstack nova-compute[86443]: INFO nova.compute.manager [req-2e9784c0-9e29-4c1e-9798-890e80e4a5a5 req-1c37166a-2fbc-4df2-8d9a-c70b44ff0bdf service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Neutron deleted interface cf792953-3429-4182-ba5e-0b8d056bc7e9; detaching it from the instance and deleting it from the info cache Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-2e9784c0-9e29-4c1e-9798-890e80e4a5a5 req-1c37166a-2fbc-4df2-8d9a-c70b44ff0bdf service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-4e87a49d-45de-4ce0-b5f2-4ac474b583a5 None None] [instance: 6fc39b29-fdc4-4758-ad2d-2cef146465ac] Checking state 
{{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1b3dbe5c-2cf3-4152-923b-7bae6d6aa417 req-6dce876d-295b-4726-9058-c8d3c8a0457c service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-vif-unplugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1b3dbe5c-2cf3-4152-923b-7bae6d6aa417 req-6dce876d-295b-4726-9058-c8d3c8a0457c service nova] Acquiring lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1b3dbe5c-2cf3-4152-923b-7bae6d6aa417 req-6dce876d-295b-4726-9058-c8d3c8a0457c service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1b3dbe5c-2cf3-4152-923b-7bae6d6aa417 req-6dce876d-295b-4726-9058-c8d3c8a0457c service nova] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1b3dbe5c-2cf3-4152-923b-7bae6d6aa417 req-6dce876d-295b-4726-9058-c8d3c8a0457c service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] No waiting events found dispatching 
network-vif-unplugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1b3dbe5c-2cf3-4152-923b-7bae6d6aa417 req-6dce876d-295b-4726-9058-c8d3c8a0457c service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Received event network-vif-unplugged-cf792953-3429-4182-ba5e-0b8d056bc7e9 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:37:52 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:37:52 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Updating instance_info_cache with network_info: [{"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-2e9784c0-9e29-4c1e-9798-890e80e4a5a5 req-1c37166a-2fbc-4df2-8d9a-c70b44ff0bdf service nova] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Detach interface failed, port_id=cf792953-3429-4182-ba5e-0b8d056bc7e9, reason: Instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 could not be found. {{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}} Mai 07 19:37:53 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Took 2.17 seconds to deallocate network for instance. 
Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Releasing lock "refresh_cache-a862cbbd-165b-49ae-b766-9d4f72038ec3" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Instance network_info: |[{"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 
tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Start _get_guest_xml network_info=[{"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': 
None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:37:53 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-VolumesNegativeTest-instance-264786630', uuid='a862cbbd-165b-49ae-b766-9d4f72038ec3'), owner=OwnerMeta(userid='30ba9350663b488b9fbe17e4811ad191', username='tempest-VolumesNegativeTest-1222330630-project-member', projectid='24e6883199094fd29e71ed13beb20723', projectname='tempest-VolumesNegativeTest-1222330630'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175473.5360284) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CPU controller found on host. {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:37:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesNegativeTest-instance-264786630',display_name='tempest-VolumesNegativeTest-instance-264786630',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesnegativetest-instance-264786630',id=12,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24e6883199094fd29e71ed13beb20723',ramdisk_id='',reservation_id='r-4xp88z4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesNegativeTest-1222330630',owner_user_name='tempest-VolumesNegativeTest-1222330630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:37:46Z,user_data=None,user_id='30ba9350663
b488b9fbe17e4811ad191',uuid=a862cbbd-165b-49ae-b766-9d4f72038ec3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Converting VIF {"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:b3:7f,bridge_name='br-int',has_traffic_filtering=True,id=47c26b58-3415-4461-a3e5-508baddcf0d1,network=Network(65522bf9-796a-4ec8-936d-4ef7d4630342),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c26b58-34') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lazy-loading 'pci_devices' on Instance uuid a862cbbd-165b-49ae-b766-9d4f72038ec3 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:53 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] End _get_guest_xml xml= Mai 07 19:37:54 devstack nova-compute[86443]: [libvirt guest domain XML logged here; its markup was stripped in this capture. Recoverable values: uuid a862cbbd-165b-49ae-b766-9d4f72038ec3, name instance-0000000c, memory 196608, vcpus 1, nova name tempest-VolumesNegativeTest-instance-264786630, creationTime 2026-05-07 17:37:53, flavor memory 192 / disk 1 / swap 0 / ephemeral 0 / vcpus 1, image container_format bare / disk_format qcow2 / min_disk 1 / min_ram 0, hw_rng_model virtio with backend /dev/urandom, owner user tempest-VolumesNegativeTest-1222330630-project-member / project tempest-VolumesNegativeTest-1222330630, sysinfo OpenStack Foundation / OpenStack Nova / 33.1.0, serial and uuid a862cbbd-165b-49ae-b766-9d4f72038ec3, family Virtual Machine, os type hvm] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Preparing to wait for external event network-vif-plugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG
oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:37:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesNegativeTest-instance-264786630',display_name='tempest-VolumesNegativeTest-instance-264786630',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesnegativetest-instance-264786630',id=12,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24e6883199094fd29e71ed13beb20723',ramdisk_id='',reservation_id='r-4xp88z4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_
roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesNegativeTest-1222330630',owner_user_name='tempest-VolumesNegativeTest-1222330630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:37:46Z,user_data=None,user_id='30ba9350663b488b9fbe17e4811ad191',uuid=a862cbbd-165b-49ae-b766-9d4f72038ec3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG 
nova.network.os_vif_util [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Converting VIF {"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:b3:7f,bridge_name='br-int',has_traffic_filtering=True,id=47c26b58-3415-4461-a3e5-508baddcf0d1,network=Network(65522bf9-796a-4ec8-936d-4ef7d4630342),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c26b58-34') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG os_vif [None 
req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:b3:7f,bridge_name='br-int',has_traffic_filtering=True,id=47c26b58-3415-4461-a3e5-508baddcf0d1,network=Network(65522bf9-796a-4ec8-936d-4ef7d4630342),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c26b58-34') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9ae47d24-ae3c-5817-8127-57f27ba9ca05', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47c26b58-34, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap47c26b58-34, col_values=(('qos', UUID('76c5393e-4ba0-4c40-983f-f4ff77614514')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap47c26b58-34, col_values=(('external_ids', {'iface-id': '47c26b58-3415-4461-a3e5-508baddcf0d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:b3:7f', 'vm-uuid': 'a862cbbd-165b-49ae-b766-9d4f72038ec3'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:54 devstack nova-compute[86443]: INFO os_vif [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:b3:7f,bridge_name='br-int',has_traffic_filtering=True,id=47c26b58-3415-4461-a3e5-508baddcf0d1,network=Network(65522bf9-796a-4ec8-936d-4ef7d4630342),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c26b58-34') Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:37:54 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.138s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:55 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Deleted allocations for instance 0a37052d-f36d-4e75-8464-1c227b87d6e5 Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a14070a3-c5fb-4f26-9fa1-1a6465683f96 req-317be7ec-6061-457a-a681-f607869de945 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a14070a3-c5fb-4f26-9fa1-1a6465683f96 req-317be7ec-6061-457a-a681-f607869de945 service nova] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a14070a3-c5fb-4f26-9fa1-1a6465683f96 req-317be7ec-6061-457a-a681-f607869de945 service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-a14070a3-c5fb-4f26-9fa1-1a6465683f96 req-317be7ec-6061-457a-a681-f607869de945 service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" "released" by
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-a14070a3-c5fb-4f26-9fa1-1a6465683f96 req-317be7ec-6061-457a-a681-f607869de945 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] No event matching network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 in dict_keys([('network-vif-plugged', '47c26b58-3415-4461-a3e5-508baddcf0d1')]) {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:348}} Mai 07 19:37:55 devstack nova-compute[86443]: WARNING nova.compute.manager [req-a14070a3-c5fb-4f26-9fa1-1a6465683f96 req-317be7ec-6061-457a-a681-f607869de945 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received unexpected event network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 for instance with vm_state building and task_state spawning. Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] No BDM found with device name vda, not building metadata.
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] No VIF found with MAC fa:16:3e:6a:b3:7f, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:55 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-69e15d7e-c106-49b1-aa8f-e4ff99597fef tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "0a37052d-f36d-4e75-8464-1c227b87d6e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.803s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:56 devstack nova-compute[86443]:
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:57 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:57 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] VM Started (Lifecycle Event) Mai 07 19:37:57 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:57 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:57 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] VM Paused (Lifecycle Event) Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e047d6ac-315d-4e2c-86a5-04e19f5dcaea 
req-94f9bc5c-c552-4473-858e-993e0b00b961 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-plugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e047d6ac-315d-4e2c-86a5-04e19f5dcaea req-94f9bc5c-c552-4473-858e-993e0b00b961 service nova] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e047d6ac-315d-4e2c-86a5-04e19f5dcaea req-94f9bc5c-c552-4473-858e-993e0b00b961 service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e047d6ac-315d-4e2c-86a5-04e19f5dcaea req-94f9bc5c-c552-4473-858e-993e0b00b961 service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e047d6ac-315d-4e2c-86a5-04e19f5dcaea req-94f9bc5c-c552-4473-858e-993e0b00b961 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Processing event network-vif-plugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None 
req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:37:58 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Instance spawned successfully. Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:37:58 devstack nova-compute[86443]: INFO nova.compute.manager [None 
req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] VM Resumed (Lifecycle Event) Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: 
a862cbbd-165b-49ae-b766-9d4f72038ec3] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:58 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:37:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:37:59 devstack nova-compute[86443]: INFO nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Took 12.22 seconds to spawn the instance on the hypervisor. 
Mai 07 19:37:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:37:59 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:37:59 devstack nova-compute[86443]: INFO nova.compute.manager [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Took 17.96 seconds to build instance. Mai 07 19:38:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef35ebb3-b2e1-4a9a-a824-2c62faf5f61e tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.490s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9d20c8d5-c6e0-4060-b5e8-0fca611963cf req-c5d362ad-aecd-4747-a98b-8ae96221df9a service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-plugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9d20c8d5-c6e0-4060-b5e8-0fca611963cf req-c5d362ad-aecd-4747-a98b-8ae96221df9a service nova] Acquiring lock 
"a862cbbd-165b-49ae-b766-9d4f72038ec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9d20c8d5-c6e0-4060-b5e8-0fca611963cf req-c5d362ad-aecd-4747-a98b-8ae96221df9a service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9d20c8d5-c6e0-4060-b5e8-0fca611963cf req-c5d362ad-aecd-4747-a98b-8ae96221df9a service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:00 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9d20c8d5-c6e0-4060-b5e8-0fca611963cf req-c5d362ad-aecd-4747-a98b-8ae96221df9a service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] No waiting events found dispatching network-vif-plugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:38:00 devstack nova-compute[86443]: WARNING nova.compute.manager [req-9d20c8d5-c6e0-4060-b5e8-0fca611963cf req-c5d362ad-aecd-4747-a98b-8ae96221df9a service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received unexpected event network-vif-plugged-47c26b58-3415-4461-a3e5-508baddcf0d1 for instance with vm_state active and task_state None. 
Mai 07 19:38:00 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:04 devstack nova-compute[86443]: INFO nova.compute.manager [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Terminating instance Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Start destroying the instance on the hypervisor. 
{{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.002461118000155693 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:05 devstack nova-compute[86443]: INFO nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] VM Stopped (Lifecycle Event) Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.998577472999841 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.994915554999807 
future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.994458792999922 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:38:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Instance destroyed successfully. Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lazy-loading 'resources' on Instance uuid a862cbbd-165b-49ae-b766-9d4f72038ec3 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-8f70473b-55cf-4fee-a7cd-1fc64fb56ee1 tempest-AttachVolumeShelveTestJSON-60761633 tempest-AttachVolumeShelveTestJSON-60761633-project-member] [instance: 0a37052d-f36d-4e75-8464-1c227b87d6e5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-cd5f5c65-a690-4e3a-ad06-22800e5e3bbd req-1adfb9cf-627b-4dce-88c1-c643a41cf57b service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-cd5f5c65-a690-4e3a-ad06-22800e5e3bbd req-1adfb9cf-627b-4dce-88c1-c643a41cf57b service nova] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:05 devstack
nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-cd5f5c65-a690-4e3a-ad06-22800e5e3bbd req-1adfb9cf-627b-4dce-88c1-c643a41cf57b service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-cd5f5c65-a690-4e3a-ad06-22800e5e3bbd req-1adfb9cf-627b-4dce-88c1-c643a41cf57b service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-cd5f5c65-a690-4e3a-ad06-22800e5e3bbd req-1adfb9cf-627b-4dce-88c1-c643a41cf57b service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] No waiting events found dispatching network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-cd5f5c65-a690-4e3a-ad06-22800e5e3bbd req-1adfb9cf-627b-4dce-88c1-c643a41cf57b service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:37:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesNegativeTest-instance-264786630',display_name='tempest-VolumesNegativeTest-instance-264786630',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-volumesnegativetest-instance-264786630',id=12,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-05-07T17:37:59Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='24e6883199094fd29e71ed13beb20723',ramdisk_id='',reservation_id='r-4xp88z4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',imag
e_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesNegativeTest-1222330630',owner_user_name='tempest-VolumesNegativeTest-1222330630-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:37:59Z,user_data=None,user_id='30ba9350663b488b9fbe17e4811ad191',uuid=a862cbbd-165b-49ae-b766-9d4f72038ec3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": "65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Converting VIF {"id": "47c26b58-3415-4461-a3e5-508baddcf0d1", "address": "fa:16:3e:6a:b3:7f", "network": {"id": 
"65522bf9-796a-4ec8-936d-4ef7d4630342", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf74477e441f4bff8612e7085667831b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c26b58-34", "ovs_interfaceid": "47c26b58-3415-4461-a3e5-508baddcf0d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:b3:7f,bridge_name='br-int',has_traffic_filtering=True,id=47c26b58-3415-4461-a3e5-508baddcf0d1,network=Network(65522bf9-796a-4ec8-936d-4ef7d4630342),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c26b58-34') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG os_vif [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:6a:b3:7f,bridge_name='br-int',has_traffic_filtering=True,id=47c26b58-3415-4461-a3e5-508baddcf0d1,network=Network(65522bf9-796a-4ec8-936d-4ef7d4630342),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c26b58-34') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47c26b58-34, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=76c5393e-4ba0-4c40-983f-f4ff77614514) {{(pid=86443) do_commit 
/opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:05 devstack nova-compute[86443]: INFO os_vif [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:b3:7f,bridge_name='br-int',has_traffic_filtering=True,id=47c26b58-3415-4461-a3e5-508baddcf0d1,network=Network(65522bf9-796a-4ec8-936d-4ef7d4630342),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c26b58-34') Mai 07 19:38:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Deleting instance files /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3_del Mai 07 19:38:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Deletion of /opt/stack/data/nova/instances/a862cbbd-165b-49ae-b766-9d4f72038ec3_del complete Mai 07 19:38:06 devstack nova-compute[86443]: INFO nova.compute.manager [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] [instance: 
a862cbbd-165b-49ae-b766-9d4f72038ec3] Took 1.38 seconds to destroy the instance on the hypervisor. Mai 07 19:38:06 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:38:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:38:06 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:38:06 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:06 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-b7ba47e8-685b-4467-aadb-4a4be9b3d340 req-d80d43fc-9c2f-46e5-b054-bf5c837dd175 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-deleted-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:38:07 devstack nova-compute[86443]: INFO nova.compute.manager [req-b7ba47e8-685b-4467-aadb-4a4be9b3d340 req-d80d43fc-9c2f-46e5-b054-bf5c837dd175 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Neutron deleted interface 47c26b58-3415-4461-a3e5-508baddcf0d1; detaching it from the instance and deleting it from the info cache
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-b7ba47e8-685b-4467-aadb-4a4be9b3d340 req-d80d43fc-9c2f-46e5-b054-bf5c837dd175 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Starting instance... {{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-b7ba47e8-685b-4467-aadb-4a4be9b3d340 req-d80d43fc-9c2f-46e5-b054-bf5c837dd175 service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Detach interface failed, port_id=47c26b58-3415-4461-a3e5-508baddcf0d1, reason: Instance a862cbbd-165b-49ae-b766-9d4f72038ec3 could not be found. {{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9124072c-25d8-45a9-a4ff-222872920615 req-0be99185-1527-492a-b1d6-a60cf6b5ce2d service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9124072c-25d8-45a9-a4ff-222872920615 req-0be99185-1527-492a-b1d6-a60cf6b5ce2d service nova] Acquiring lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9124072c-25d8-45a9-a4ff-222872920615 req-0be99185-1527-492a-b1d6-a60cf6b5ce2d service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9124072c-25d8-45a9-a4ff-222872920615 req-0be99185-1527-492a-b1d6-a60cf6b5ce2d service nova] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9124072c-25d8-45a9-a4ff-222872920615 req-0be99185-1527-492a-b1d6-a60cf6b5ce2d service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] No waiting events found dispatching network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}}
Mai 07 19:38:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9124072c-25d8-45a9-a4ff-222872920615 req-0be99185-1527-492a-b1d6-a60cf6b5ce2d service nova] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Received event network-vif-unplugged-47c26b58-3415-4461-a3e5-508baddcf0d1 for instance with task_state deleting. {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}}
Mai 07 19:38:08 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Took 1.65 seconds to deallocate network for instance.
Mai 07 19:38:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:08 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}}
Mai 07 19:38:08 devstack nova-compute[86443]: INFO nova.compute.claims [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Claim successful on node devstack
Mai 07 19:38:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:09 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}}
Mai 07 19:38:09 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Mai 07 19:38:09 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.165s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.679s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Allocating IP information in the background. {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}}
Mai 07 19:38:10 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:38:10 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:38:10 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}}
Mai 07 19:38:11 devstack nova-compute[86443]: DEBUG nova.policy [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd98e17081ef14e01bf76138813a4d56a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1ea2fed9f654419a4de1a6168d279ab', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}}
Mai 07 19:38:11 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Mai 07 19:38:11 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.158s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:38:11 devstack nova-compute[86443]: INFO nova.scheduler.client.report [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Deleted allocations for instance a862cbbd-165b-49ae-b766-9d4f72038ec3
Mai 07 19:38:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}}
Mai 07 19:38:11 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Start building block device mappings for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Successfully created port: e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-63a573bb-8358-4e4b-b82b-bb007c66f803 tempest-VolumesNegativeTest-1222330630 tempest-VolumesNegativeTest-1222330630-project-member] Lock "a862cbbd-165b-49ae-b766-9d4f72038ec3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 8.038s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Starting instance... {{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Start spawning the instance on the hypervisor. {{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}}
Mai 07 19:38:12 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Creating image(s)
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "/opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:38:12 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.143s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.123s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk 1073741824" returned: 0 in 0.080s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.231s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}}
Mai 07 19:38:13 devstack nova-compute[86443]: INFO nova.compute.claims [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Claim successful on node devstack
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.215s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Checking if we can resize image /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Successfully updated port: e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-changed-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Refreshing instance network info cache due to event network-changed-e4bf19ba-d2b9-4920-a784-c31b806fc92a. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] Acquiring lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] Acquired lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Refreshing network info cache for port e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk --force-share --output=json" returned: 0 in 0.183s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Cannot resize image /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Ensure instance console log exists: /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}}
Mai 07 19:38:13 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None
req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:38:14 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:14 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Instance cache missing network info. 
{{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:38:14 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:14 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:38:14 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:38:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-ea49ce06-11c3-49b7-a921-ad77be700443 req-1e248cb0-2990-482c-bfe4-5d366305e548 service nova] Releasing lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:14 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquired lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:38:14 
devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:38:15 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:38:15 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Instance cache missing network info. 
{{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:38:15 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.196s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:15 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:38:15 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:15 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Updating instance_info_cache with network_info: [{"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:15 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:38:16 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:16 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.policy [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21cc0090dbde493eb6cf6ce289991c0f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4d37f2809da402f9ff2f13e7afa7372', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Releasing lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Instance network_info: |[{"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Start _get_guest_xml network_info=[{"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:38:16 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-AttachVolumeNegativeTest-server-1979295383', uuid='17b05464-04bc-4019-8dfd-e4dd51eed233'), owner=OwnerMeta(userid='d98e17081ef14e01bf76138813a4d56a', username='tempest-AttachVolumeNegativeTest-429871213-project-member', projectid='b1ea2fed9f654419a4de1a6168d279ab', projectname='tempest-AttachVolumeNegativeTest-429871213'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175496.3780959) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:38:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1979295383',display_name='tempest-AttachVolumeNegativeTest-server-1979295383',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1979295383',id=13,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMwPjyPNCSPtoqoqYlAGIuvsr9bwVFpPeklDJeHZSk/RaYM/Sy1EfiGX+0yUBpXXT4QxYcF2uzijc18TPZPHMEt3dMBq14hDEz5D1pUl9mw7SgCScXfQb+m05E9Nyxs1qA==',key_name='tempest-keypair-658850004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-s728gv3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVo
lumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:38:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=17b05464-04bc-4019-8dfd-e4dd51eed233,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": 
"da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:5b,bridge_name='br-int',has_traffic_filtering=True,id=e4bf19ba-d2b9-4920-a784-c31b806fc92a,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4bf19ba-d2') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'pci_devices' on Instance uuid 17b05464-04bc-4019-8dfd-e4dd51eed233 {{(pid=86443) 
obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:38:16 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] End _get_guest_xml xml= Mai 07 19:38:16 devstack nova-compute[86443]: [guest XML elided: the element markup was stripped when this log was captured; recoverable values: uuid 17b05464-04bc-4019-8dfd-e4dd51eed233, name instance-0000000d (display name tempest-AttachVolumeNegativeTest-server-1979295383), memory 196608 KiB, 1 vCPU, type hvm, image qcow2/bare, virtio rng backed by /dev/urandom, sysinfo OpenStack Foundation / OpenStack Nova 33.1.0, creationTime 2026-05-07 17:38:16] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Preparing to wait for external event
network-vif-plugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:38:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1979295383',display_name='tempest-AttachVolumeNegativeTest-server-1979295383',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1979295383',id=13,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMwPjyPNCSPtoqoqYlAGIuvsr9bwVFpPeklDJeHZSk/RaYM/Sy1EfiGX+0yUBpXXT4QxYcF2uzijc18TPZPHMEt3dMBq14hDEz5D1pUl9mw7SgCScXfQb+m05E9Nyxs1qA==',key_name='tempest-keypair-658850004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-s728gv3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tem
pest-AttachVolumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:38:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=17b05464-04bc-4019-8dfd-e4dd51eed233,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": 
"da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:5b,bridge_name='br-int',has_traffic_filtering=True,id=e4bf19ba-d2b9-4920-a784-c31b806fc92a,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4bf19ba-d2') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG os_vif [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:5b,bridge_name='br-int',has_traffic_filtering=True,id=e4bf19ba-d2b9-4920-a784-c31b806fc92a,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4bf19ba-d2') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '20de16a9-6836-5500-8012-bfd533ee4b8f', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:16 devstack 
nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape4bf19ba-d2, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape4bf19ba-d2, col_values=(('qos', UUID('e2b6a851-4ded-4c62-b0b4-9b56603b2047')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape4bf19ba-d2, col_values=(('external_ids', {'iface-id': 'e4bf19ba-d2b9-4920-a784-c31b806fc92a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:2a:5b', 'vm-uuid': '17b05464-04bc-4019-8dfd-e4dd51eed233'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup 
/opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:16 devstack nova-compute[86443]: INFO os_vif [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:5b,bridge_name='br-int',has_traffic_filtering=True,id=e4bf19ba-d2b9-4920-a784-c31b806fc92a,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4bf19ba-d2') Mai 07 19:38:17 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Start building block device mappings for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:38:17 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Successfully created port: 8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Start spawning the instance on the hypervisor. 
{{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:38:18 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Creating image(s) Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "/opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "/opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 
tempest-TaggedAttachmentsTest-239580738-project-member] Lock "/opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None 
req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.155s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG 
oslo_utils.imageutils.format_inspector [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Successfully updated port: 8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received event network-changed-8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] [instance: 
b2741568-56f8-41e0-a67f-5b8951faeef5] Refreshing instance network info cache due to event network-changed-8f188046-7a8f-4a3e-aae7-85f17ad61655. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] Acquiring lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] Acquired lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Refreshing network info cache for port 8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.171s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None 
req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk 1073741824 {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No VIF found with MAC fa:16:3e:71:2a:5b, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk 1073741824" returned: 0 in 0.075s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 
tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.274s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.281s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Checking if we can resize image 
/opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:38:19 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk --force-share --output=json" returned: 0 in 0.313s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Cannot resize image /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Ensure instance console log exists: /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 
tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Instance cache missing network info. 
{{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fbe040d3-f093-4a22-827b-7d903740b543 req-215a199f-2cac-47c9-bc9c-14a17bf8c292 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-vif-plugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fbe040d3-f093-4a22-827b-7d903740b543 req-215a199f-2cac-47c9-bc9c-14a17bf8c292 service nova] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fbe040d3-f093-4a22-827b-7d903740b543 req-215a199f-2cac-47c9-bc9c-14a17bf8c292 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-fbe040d3-f093-4a22-827b-7d903740b543 req-215a199f-2cac-47c9-bc9c-14a17bf8c292 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-fbe040d3-f093-4a22-827b-7d903740b543 req-215a199f-2cac-47c9-bc9c-14a17bf8c292 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Processing event 
network-vif-plugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:19 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3da75c84-00ea-463f-a4a3-56d729b8d30c req-fc4b1d05-922b-4cba-a0c1-20ef7eb7b62a service nova] Releasing lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 
tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquired lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Started> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:20 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] VM Started (Lifecycle Event) Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:38:20 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 
17b05464-04bc-4019-8dfd-e4dd51eed233] Instance spawned successfully. Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.005752301000029547 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:20 devstack nova-compute[86443]: INFO nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] VM Stopped (Lifecycle Event) Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 
17b05464-04bc-4019-8dfd-e4dd51eed233] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:20 devstack 
nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:38:20 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: a862cbbd-165b-49ae-b766-9d4f72038ec3] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:21 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:21 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] During sync_power_state the instance has a pending task (spawning). Skip. Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:21 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] VM Paused (Lifecycle Event) Mai 07 19:38:21 devstack nova-compute[86443]: INFO nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Took 8.33 seconds to spawn the instance on the hypervisor. 
Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Updating instance_info_cache with network_info: [{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-372e2118-888b-40c5-accb-218f93908d89 req-1af0ab9f-d69d-49c6-bf8b-9d5d794558f8 
service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-vif-plugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-372e2118-888b-40c5-accb-218f93908d89 req-1af0ab9f-d69d-49c6-bf8b-9d5d794558f8 service nova] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-372e2118-888b-40c5-accb-218f93908d89 req-1af0ab9f-d69d-49c6-bf8b-9d5d794558f8 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-372e2118-888b-40c5-accb-218f93908d89 req-1af0ab9f-d69d-49c6-bf8b-9d5d794558f8 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.012s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-372e2118-888b-40c5-accb-218f93908d89 req-1af0ab9f-d69d-49c6-bf8b-9d5d794558f8 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] No waiting events found dispatching network-vif-plugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:38:21 devstack nova-compute[86443]: WARNING nova.compute.manager 
[req-372e2118-888b-40c5-accb-218f93908d89 req-1af0ab9f-d69d-49c6-bf8b-9d5d794558f8 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received unexpected event network-vif-plugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a for instance with vm_state active and task_state None. Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:21 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] VM Resumed (Lifecycle Event) Mai 07 19:38:21 devstack nova-compute[86443]: INFO nova.compute.manager [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Took 13.67 seconds to build instance. 
Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Releasing lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Instance network_info: |[{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Start _get_guest_xml network_info=[{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:38:21 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-device-tagging-server-1300705151', uuid='b2741568-56f8-41e0-a67f-5b8951faeef5'), owner=OwnerMeta(userid='21cc0090dbde493eb6cf6ce289991c0f', username='tempest-TaggedAttachmentsTest-239580738-project-member', projectid='c4d37f2809da402f9ff2f13e7afa7372', projectname='tempest-TaggedAttachmentsTest-239580738'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175501.962862) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 
tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=86443) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:38:21 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:38:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1300705151',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-device-tagging-server-1300705151',id=14,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKD8Vpl4njOm4xqBb+sXv53N4PvGSB1MN/VwWG8pWG9a2+Dq9PgXsVA3B1IWStlC5rKiXUNab63OAul7RQXaq+YEV2O5UjISdIqUjRa7gCmodUctU0V+y09ZxaU9LZ1QA==',key_name='tempest-keypair-1027236808',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4d37f2809da402f9ff2f13e7afa7372',ramdisk_id='',reservation_id='r-q3qsjhpd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TaggedAttachmentsTest-239580738',owner_user_name='tempest-TaggedAttachm
entsTest-239580738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:38:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='21cc0090dbde493eb6cf6ce289991c0f',uuid=b2741568-56f8-41e0-a67f-5b8951faeef5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converting VIF {"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": 
"tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:15:e8,bridge_name='br-int',has_traffic_filtering=True,id=8f188046-7a8f-4a3e-aae7-85f17ad61655,network=Network(048c8620-4162-41b0-baec-2d18d63f2ed8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f188046-7a') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lazy-loading 'pci_devices' on Instance uuid b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:38:22 
devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7fd6c200-2258-480e-8878-8b2f994b8859 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.193s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] End _get_guest_xml xml= [guest domain XML omitted: the XML markup was stripped in this capture, leaving only orphaned element text spread across the following journal lines; recoverable values were: uuid b2741568-56f8-41e0-a67f-5b8951faeef5, name instance-0000000e, nova display name tempest-device-tagging-server-1300705151, creationTime 2026-05-07 17:38:21, memory 196608 (KiB), 1 vCPU, flavor 192 MB / 1 vCPU / 0 ephemeral / 0 swap / 1 GB root, image container format bare, disk format qcow2, min_disk 1, min_ram 0, rng model virtio, owner user tempest-TaggedAttachmentsTest-239580738-project-member, owner project tempest-TaggedAttachmentsTest-239580738, sysinfo OpenStack Foundation / OpenStack Nova / 33.1.0, serial and uuid b2741568-56f8-41e0-a67f-5b8951faeef5, product Virtual Machine, os type hvm, rng backend /dev/urandom] {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8199}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Preparing to wait for external event network-vif-plugged-8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:306}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=86443) inner
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:38:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1300705151',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-device-tagging-server-1300705151',id=14,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKD8Vpl4njOm4xqBb+sXv53N4PvGSB1MN/VwWG8pWG9a2+Dq9PgXsVA3B1IWStlC5rKiXUNab63OAul7RQXaq+YEV2O5UjISdIqUjRa7gCmodUctU0V+y09ZxaU9LZ1QA==',key_name='tempest-keypair-1027236808',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4d37f2809da402f9ff2f13e7afa7372',ramdisk_id='',reservation_id='r-q3qsjhpd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TaggedAttachmentsTest-239580738',owner_user_name='tempest-TaggedAttachmentsTest-239580738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:38:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='21cc0090dbde493eb6cf6ce289991c0f',uuid=b2741568-56f8-41e0-a67f-5b8951faeef5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converting VIF {"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": 
"8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:15:e8,bridge_name='br-int',has_traffic_filtering=True,id=8f188046-7a8f-4a3e-aae7-85f17ad61655,network=Network(048c8620-4162-41b0-baec-2d18d63f2ed8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f188046-7a') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG os_vif [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:15:e8,bridge_name='br-int',has_traffic_filtering=True,id=8f188046-7a8f-4a3e-aae7-85f17ad61655,network=Network(048c8620-4162-41b0-baec-2d18d63f2ed8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f188046-7a') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=86443) do_commit 
/opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1df70c63-71ab-5463-a70d-ba6213a87b8e', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f188046-7a, may_exist=True, interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8f188046-7a, col_values=(('qos', UUID('2096cb78-b5f4-47a8-abfc-a24609f86f6d')),), if_exists=True) {{(pid=86443) do_commit 
/opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8f188046-7a, col_values=(('external_ids', {'iface-id': '8f188046-7a8f-4a3e-aae7-85f17ad61655', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:15:e8', 'vm-uuid': 'b2741568-56f8-41e0-a67f-5b8951faeef5'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:22 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:22 devstack nova-compute[86443]: INFO os_vif [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:15:e8,bridge_name='br-int',has_traffic_filtering=True,id=8f188046-7a8f-4a3e-aae7-85f17ad61655,network=Network(048c8620-4162-41b0-baec-2d18d63f2ed8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f188046-7a') Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No BDM found with device name 
vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No BDM found with device name hda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No VIF found with MAC fa:16:3e:ac:15:e8, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:38:24 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Using config drive Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-changed-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Refreshing instance network info cache due to event network-changed-e4bf19ba-d2b9-4920-a784-c31b806fc92a. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] Acquiring lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] Acquired lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:38:24 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Refreshing network info cache for port e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:38:24 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:24 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Creating config drive at /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk.config Mai 07 19:38:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 33.1.0 -quiet -J -r -V config-2 /tmp/tmp9hv0zdrd {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 33.1.0 -quiet -J -r -V config-2 /tmp/tmp9hv0zdrd" returned: 0 in 0.115s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:25 devstack nova-compute[86443]: WARNING 
neutronclient.v2_0.client [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-3750fb9a-0e4b-4e55-8c29-3c71208e644e req-19cdb808-02a5-4b6b-89fb-2f21a083d2c8 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received event network-vif-plugged-8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3750fb9a-0e4b-4e55-8c29-3c71208e644e req-19cdb808-02a5-4b6b-89fb-2f21a083d2c8 service nova] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3750fb9a-0e4b-4e55-8c29-3c71208e644e req-19cdb808-02a5-4b6b-89fb-2f21a083d2c8 service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-3750fb9a-0e4b-4e55-8c29-3c71208e644e req-19cdb808-02a5-4b6b-89fb-2f21a083d2c8 service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG nova.compute.manager 
[req-3750fb9a-0e4b-4e55-8c29-3c71208e644e req-19cdb808-02a5-4b6b-89fb-2f21a083d2c8 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Processing event network-vif-plugged-8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11746}} Mai 07 19:38:26 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=86443) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:601}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: ... => Started> 
{{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:27 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] VM Started (Lifecycle Event) Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Guest created on hypervisor {{(pid=86443) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4893}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Updated VIF entry in instance network info cache for port e4bf19ba-d2b9-4920-a784-c31b806fc92a. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Updating instance_info_cache with network_info: [{"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.156", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:27 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Instance spawned successfully. Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1012}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Synchronizing instance power state after lifecycle event "Started"; current 
vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-11cce474-7aaf-4f3b-9e8c-680852fa00ea req-3feebc03-ba03-4e05-9202-96fe917a530f service nova] Releasing lock "refresh_cache-17b05464-04bc-4019-8dfd-e4dd51eed233" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Found default for hw_cdrom_bus of ide {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Found default for hw_disk_bus of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Found default for hw_input_bus of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: 
b2741568-56f8-41e0-a67f-5b8951faeef5] Found default for hw_pointer_model of None {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Found default for hw_video_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Found default for hw_vif_model of virtio {{(pid=86443) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:1041}} Mai 07 19:38:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] During sync_power_state the instance has a pending task (spawning). Skip. 
Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: ... => Paused> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] VM Paused (Lifecycle Event) Mai 07 19:38:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Took 10.18 seconds to spawn the instance on the hypervisor. Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-935cdfd5-f7c8-4bd0-b36a-85e9b6d1a278 req-ae760be4-59d2-4513-b8c7-f98d6dd1bf7f service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received event network-vif-plugged-8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-935cdfd5-f7c8-4bd0-b36a-85e9b6d1a278 req-ae760be4-59d2-4513-b8c7-f98d6dd1bf7f service nova] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils 
[req-935cdfd5-f7c8-4bd0-b36a-85e9b6d1a278 req-ae760be4-59d2-4513-b8c7-f98d6dd1bf7f service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.007s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-935cdfd5-f7c8-4bd0-b36a-85e9b6d1a278 req-ae760be4-59d2-4513-b8c7-f98d6dd1bf7f service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-935cdfd5-f7c8-4bd0-b36a-85e9b6d1a278 req-ae760be4-59d2-4513-b8c7-f98d6dd1bf7f service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] No waiting events found dispatching network-vif-plugged-8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:38:28 devstack nova-compute[86443]: WARNING nova.compute.manager [req-935cdfd5-f7c8-4bd0-b36a-85e9b6d1a278 req-ae760be4-59d2-4513-b8c7-f98d6dd1bf7f service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received unexpected event network-vif-plugged-8f188046-7a8f-4a3e-aae7-85f17ad61655 for instance with vm_state building and task_state spawning. 
Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:28 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Emitting event <LifecycleEvent: ... => Resumed> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:38:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] VM Resumed (Lifecycle Event) Mai 07 19:38:28 devstack nova-compute[86443]: INFO nova.compute.manager [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Took 15.55 seconds to build instance. 
Mai 07 19:38:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:38:29 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=86443) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1540}} Mai 07 19:38:29 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-98d99ddb-9b9d-4d8e-ad12-5552109b90c5 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.085s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:31 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received event network-changed-8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:31 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Refreshing instance network info cache due to event network-changed-8f188046-7a8f-4a3e-aae7-85f17ad61655. 
{{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:38:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] Acquiring lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:38:31 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] Acquired lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:38:31 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Refreshing network info cache for port 8f188046-7a8f-4a3e-aae7-85f17ad61655 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:38:31 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:32 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:32 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:32 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:33 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Updated VIF entry in instance network info cache for port 8f188046-7a8f-4a3e-aae7-85f17ad61655. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3521}} Mai 07 19:38:33 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Updating instance_info_cache with network_info: [{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.42", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, 
"bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:33 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-298f1d0a-d4b0-4af2-b3b6-3ebf9cc65252 req-05f31685-c9e6-46e5-864a-704834109075 service nova] Releasing lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:34 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:36 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:37 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:41 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:42 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:44 
devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:44 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=86443) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:11402}} Mai 07 19:38:44 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.99 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:38:45 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:45 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 0.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:38:45 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:45 devstack 
nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 1.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:38:46 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:46 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:46 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 2.00 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:38:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:46 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:46 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:47 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'flavor' on Instance uuid 17b05464-04bc-4019-8dfd-e4dd51eed233 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:38:47 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:48 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager.update_available_resource {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.523s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:48 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:48 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Auditing locally available compute resources for devstack (node: devstack) {{(pid=86443) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:937}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:49 devstack nova-compute[86443]: INFO nova.compute.manager [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Attaching volume 54d581ee-3c22-4ba6-9249-75c567e558a6 to /dev/vdb Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk --force-share 
--output=json" returned: 0 in 0.144s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:38:49 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:fe71:2a5b', 'fe80::4885:dff:fefa:58b3', 'fe80::fc16:3eff:feac:15e8', 'fe80::88b2:9aff:fead:e331'] Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] <== get_connector_properties: return (120ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Updating existing volume attachment record: b923aa89-01b6-4762-9834-a65a07be0075 {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/b2741568-56f8-41e0-a67f-5b8951faeef5/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:49 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:50 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running cmd (subprocess): env LANG=C uptime {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] CMD "env LANG=C uptime" returned: 0 in 0.039s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Hypervisor/Node resource view: name=devstack free_ram=4870MB free_disk=14.667057037353516GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_00_0", "address": "0000:02:00.0", "product_id": "000d", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000d", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1111", "vendor_id": "1234", "numa_node": null, "label": "label_1234_1111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": 
"0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1043", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1043", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] {{(pid=86443) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1136}} Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:50 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance 17b05464-04bc-4019-8dfd-e4dd51eed233 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. {{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Instance b2741568-56f8-41e0-a67f-5b8951faeef5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. 
{{(pid=86443) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1740}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Total usable vcpus: 4, total allocated vcpus: 2 {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1159}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Final resource view: name=devstack phys_ram=11961MB used_ram=896MB phys_disk=25GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:38:50 up 38 min, 1 user, load average: 6.48, 6.32, 4.01\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_b1ea2fed9f654419a4de1a6168d279ab': '1', 'io_workload': '0', 'num_proj_c4d37f2809da402f9ff2f13e7afa7372': '1'} {{(pid=86443) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1168}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:51 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None 
req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "interface-b2741568-56f8-41e0-a67f-5b8951faeef5-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "interface-b2741568-56f8-41e0-a67f-5b8951faeef5-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lazy-loading 'flavor' on Instance uuid b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG 
nova.compute.resource_tracker [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Compute_service record updated for devstack:devstack {{(pid=86443) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1097}} Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.235s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 1.99 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:52 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:52 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lazy-loading 'pci_requests' on Instance uuid b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:38:53 devstack nova-compute[86443]: DEBUG nova.objects.base [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Object Instance lazy-loaded attributes: flavor,pci_requests {{(pid=86443) wrapper /opt/stack/nova/nova/objects/base.py:136}} Mai 07 19:38:53 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:38:53 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:53 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:53 devstack nova-compute[86443]: DEBUG nova.policy [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21cc0090dbde493eb6cf6ce289991c0f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4d37f2809da402f9ff2f13e7afa7372', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:38:54 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:54 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:54 devstack nova-compute[86443]: DEBUG oslo_service.periodic_task [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=86443) run_periodic_tasks /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/periodic_task.py:210}} Mai 07 19:38:54 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [None req-6e34fa0a-7fc0-4c62-976a-3f482e11bd7b None None] Dynamic interval looping call 'nova.service.Service.periodic_tasks' sleeping for 51.57 seconds {{(pid=86443) _run_loop 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:38:54 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Successfully created port: 16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:38:54 devstack nova-compute[86443]: DEBUG oslo.service.backend._threading.loopingcall [-] Fixed interval looping call 'nova.servicegroup.drivers.db.DbDriver._report_state' sleeping for 119.50 seconds {{(pid=86443) _run_loop /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_threading/loopingcall.py:125}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] 
[instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] systemd detected. {{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Mounting volume osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa at mount point /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 via systemd-run {{(pid=86443) mount_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:79}} Mai 07 19:38:55 devstack nova-compute[86443]: INFO nova.virt.libvirt.volume.quobyte [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Mounted volume: osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.218s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'flavor' on Instance uuid 17b05464-04bc-4019-8dfd-e4dd51eed233 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 
tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Successfully updated port: 16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received event network-changed-16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Refreshing instance network info cache due to event network-changed-16800502-f6c4-482f-ac27-a04ec6e0393a. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] Acquiring lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] Acquired lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Refreshing network info cache for port 
16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:38:55 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] attach device xml: [multi-line device XML stripped in log capture; only the <serial> value 54d581ee-3c22-4ba6-9249-75c567e558a6 survives] {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:38:56 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:38:56 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:56 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:56 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Added VIF to instance network info cache for port 16800502-f6c4-482f-ac27-a04ec6e0393a. {{(pid=86443) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3528}} Mai 07 19:38:56 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Updating instance_info_cache with network_info: [{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.42", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": 
"tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:56 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-1aec2fee-cfb8-42e1-8298-ed83ece97e22 req-5d6ec227-084f-4e79-96a5-bad29a464ff6 service nova] Releasing lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:57 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquired lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:38:57 devstack nova-compute[86443]: DEBUG 
nova.network.neutron [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:38:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:38:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No BDM found with device name vdb, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:38:57 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] No VIF found with MAC fa:16:3e:71:2a:5b, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:38:57 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:57 devstack nova-compute[86443]: WARNING nova.network.neutron [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] d500267b-793e-4389-b798-4737dc7c6ecd already exists in list: networks 
containing: ['d500267b-793e-4389-b798-4737dc7c6ecd']. ignoring it Mai 07 19:38:57 devstack nova-compute[86443]: WARNING nova.network.neutron [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] 16800502-f6c4-482f-ac27-a04ec6e0393a already exists in list: port_ids containing: ['16800502-f6c4-482f-ac27-a04ec6e0393a']. ignoring it Mai 07 19:38:57 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:38:58 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Updating instance_info_cache with network_info: [{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.42", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Releasing lock "refresh_cache-b2741568-56f8-41e0-a67f-5b8951faeef5" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:38:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1300705151',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-device-tagging-server-1300705151',id=14,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKD8Vpl4njOm4xqBb+sXv53N4PvGSB1MN/VwWG8pWG9a2+Dq9PgXsVA3B1IWStlC5rKiXUNab63OAul7RQXaq+YEV2O5UjISdIqUjRa7gCmodUctU0V+y09ZxaU9LZ1QA==',key_name='tempest-keypair-1027236808',keypairs=,launch_index=0,launched_at=2026-05-07T17:38:28Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c4d37f2809da402f9ff2f13e7afa7372',ramdisk_id='',reservation_id='r-q3qsjhpd',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-TaggedAttachmentsTest-239580738',owner_user_name='tempest-TaggedAttachmentsTest-239580738-project-member'},tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:38:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='21cc0090dbde493eb6cf6ce289991c0f',uuid=b2741568-56f8-41e0-a67f-5b8951faeef5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": 
"tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) plug /opt/stack/nova/nova/virt/libvirt/vif.py:763}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converting VIF {"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", 
"bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:e1:98,bridge_name='br-int',has_traffic_filtering=True,id=16800502-f6c4-482f-ac27-a04ec6e0393a,network=Network(d500267b-793e-4389-b798-4737dc7c6ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16800502-f6') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG os_vif [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:e1:98,bridge_name='br-int',has_traffic_filtering=True,id=16800502-f6c4-482f-ac27-a04ec6e0393a,network=Network(d500267b-793e-4389-b798-4737dc7c6ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16800502-f6') {{(pid=86443) plug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:76}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) 
{{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8da3478b-63e0-514e-93c4-e767eacc509a', '_type': 'linux-noop'}}, row=False) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16800502-f6, may_exist=True, 
interface_attrs={}) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap16800502-f6, col_values=(('qos', UUID('c8aabff2-d386-4c97-95ea-13f2771d64e2')),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap16800502-f6, col_values=(('external_ids', {'iface-id': '16800502-f6c4-482f-ac27-a04ec6e0393a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:e1:98', 'vm-uuid': 'b2741568-56f8-41e0-a67f-5b8951faeef5'}),), if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:58 devstack nova-compute[86443]: INFO os_vif [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:e1:98,bridge_name='br-int',has_traffic_filtering=True,id=16800502-f6c4-482f-ac27-a04ec6e0393a,network=Network(d500267b-793e-4389-b798-4737dc7c6ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16800502-f6') Mai 07 19:38:58 
devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:38:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1300705151',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-device-tagging-server-1300705151',id=14,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKD8Vpl4njOm4xqBb+sXv53N4PvGSB1MN/VwWG8pWG9a2+Dq9PgXsVA3B1IWStlC5rKiXUNab63OAul7RQXaq+YEV2O5UjISdIqUjRa7gCmodUctU0V+y09ZxaU9LZ1QA==',key_name='tempest-keypair-1027236808',keypairs=,launch_index=0,launched_at=2026-05-07T17:38:28Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c4d37f2809da402f9ff2f13e7afa7372',ramdisk_id='',reservation_id='r-q3qsjhpd',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video
_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-TaggedAttachmentsTest-239580738',owner_user_name='tempest-TaggedAttachmentsTest-239580738-project-member'},tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:38:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='21cc0090dbde493eb6cf6ce289991c0f',uuid=b2741568-56f8-41e0-a67f-5b8951faeef5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None 
req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converting VIF {"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:e1:98,bridge_name='br-int',has_traffic_filtering=True,id=16800502-f6c4-482f-ac27-a04ec6e0393a,network=Network(d500267b-793e-4389-b798-4737dc7c6ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16800502-f6') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:38:58 devstack nova-compute[86443]: DEBUG 
nova.virt.libvirt.guest [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] attach device xml: [device XML body stripped during log capture] {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-97ae8470-7be3-4a75-8194-e8f536ad06b5 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 9.582s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9c875d7d-cbfd-4a38-8b77-b9c2f2eac007 req-2543440e-8adb-4de5-868a-b559c307ff3a service nova] 
[instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received event network-vif-plugged-16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9c875d7d-cbfd-4a38-8b77-b9c2f2eac007 req-2543440e-8adb-4de5-868a-b559c307ff3a service nova] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9c875d7d-cbfd-4a38-8b77-b9c2f2eac007 req-2543440e-8adb-4de5-868a-b559c307ff3a service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-9c875d7d-cbfd-4a38-8b77-b9c2f2eac007 req-2543440e-8adb-4de5-868a-b559c307ff3a service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.009s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-9c875d7d-cbfd-4a38-8b77-b9c2f2eac007 req-2543440e-8adb-4de5-868a-b559c307ff3a service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] No waiting events found dispatching network-vif-plugged-16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:38:59 devstack nova-compute[86443]: WARNING nova.compute.manager [req-9c875d7d-cbfd-4a38-8b77-b9c2f2eac007 
req-2543440e-8adb-4de5-868a-b559c307ff3a service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received unexpected event network-vif-plugged-16800502-f6c4-482f-ac27-a04ec6e0393a for instance with vm_state active and task_state None. Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:38:59 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:00 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" acquired by 
"nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:00 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:39:00 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No BDM found with device name hda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:39:00 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No VIF found with MAC fa:16:3e:ac:15:e8, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:39:01 devstack nova-compute[86443]: INFO nova.compute.manager [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Detaching volume 54d581ee-3c22-4ba6-9249-75c567e558a6 Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', 
instance_meta=NovaInstanceMeta(name='tempest-device-tagging-server-1300705151', uuid='b2741568-56f8-41e0-a67f-5b8951faeef5'), owner=OwnerMeta(userid='21cc0090dbde493eb6cf6ce289991c0f', username='tempest-TaggedAttachmentsTest-239580738-project-member', projectid='c4d37f2809da402f9ff2f13e7afa7372', projectname='tempest-TaggedAttachmentsTest-239580738'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'ide', 'hw_disk_bus': 'virtio', 'hw_machine_type': 'pc', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "address": "fa:16:3e:ac:15:e8", "network": {"id": "048c8620-4162-41b0-baec-2d18d63f2ed8", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-711623497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.42", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f188046-7a", "ovs_interfaceid": "8f188046-7a8f-4a3e-aae7-85f17ad61655", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": 
"16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175541.1499116) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] set metadata xml: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: tempest-device-tagging-server-1300705151 Mai 07 19:39:01 devstack nova-compute[86443]: 2026-05-07 17:39:01 Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: 192 Mai 07 19:39:01 devstack nova-compute[86443]: 1 Mai 07 19:39:01 devstack nova-compute[86443]: 0 Mai 07 19:39:01 devstack nova-compute[86443]: 0 Mai 07 19:39:01 devstack nova-compute[86443]: 1 Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 
devstack nova-compute[86443]: True Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: bare Mai 07 19:39:01 devstack nova-compute[86443]: qcow2 Mai 07 19:39:01 devstack nova-compute[86443]: 1 Mai 07 19:39:01 devstack nova-compute[86443]: 0 Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: ide Mai 07 19:39:01 devstack nova-compute[86443]: virtio Mai 07 19:39:01 devstack nova-compute[86443]: pc Mai 07 19:39:01 devstack nova-compute[86443]: virtio Mai 07 19:39:01 devstack nova-compute[86443]: virtio Mai 07 19:39:01 devstack nova-compute[86443]: virtio Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: tempest-TaggedAttachmentsTest-239580738-project-member Mai 07 19:39:01 devstack nova-compute[86443]: tempest-TaggedAttachmentsTest-239580738 Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: {{(pid=86443) set_metadata /opt/stack/nova/nova/virt/libvirt/guest.py:371}} Mai 07 19:39:01 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Attempting to driver detach volume 
54d581ee-3c22-4ba6-9249-75c567e558a6 from mountpoint /dev/vdb Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Found disk vdb by alias ua-54d581ee-3c22-4ba6-9249-75c567e558a6 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Found disk vdb by alias ua-54d581ee-3c22-4ba6-9249-75c567e558a6 {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Attempting to detach device vdb from instance 17b05464-04bc-4019-8dfd-e4dd51eed233 from the persistent domain config. {{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] detach device xml: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: 54d581ee-3c22-4ba6-9249-75c567e558a6 Mai 07 19:39:01 devstack nova-compute[86443]:
Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:39:01 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully detached device vdb from instance 17b05464-04bc-4019-8dfd-e4dd51eed233 from the persistent domain config. Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] (1/8): Attempting to detach device vdb with device alias ua-54d581ee-3c22-4ba6-9249-75c567e558a6 from instance 17b05464-04bc-4019-8dfd-e4dd51eed233 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] detach device xml: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: 54d581ee-3c22-4ba6-9249-75c567e558a6 Mai 07 19:39:01 devstack nova-compute[86443]:
Mai 07 19:39:01 devstack nova-compute[86443]: Mai 07 19:39:01 devstack nova-compute[86443]: {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event ua-54d581ee-3c22-4ba6-9249-75c567e558a6> from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Start waiting for the detach event from libvirt for device vdb with device alias ua-54d581ee-3c22-4ba6-9249-75c567e558a6 for instance 17b05464-04bc-4019-8dfd-e4dd51eed233 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}} Mai 07 19:39:01 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully detached device vdb from instance 17b05464-04bc-4019-8dfd-e4dd51eed233 from the live domain config. 
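The detach sequence recorded above follows a fixed pattern: detach the device from the persistent domain config once, then repeatedly ("(1/8): Attempting to detach ...") ask the live domain to detach and wait for the device-removed event from libvirt before declaring success. A minimal sketch of that retry loop, with illustrative names and timeout values (the attempt counter of 8 matches the "(1/8)" marker in the log; the real logic lives in nova.virt.libvirt.driver, and the callables here are hypothetical stand-ins for the hypervisor and event plumbing):

```python
MAX_ATTEMPTS = 8      # matches the "(1/8)" counter in the log above
EVENT_TIMEOUT = 20.0  # illustrative; Nova derives this from configuration


class DeviceDetachFailed(Exception):
    """Raised when the device is still attached after all retries."""


def detach_with_retry(detach_live, wait_for_event,
                      max_attempts=MAX_ATTEMPTS, timeout=EVENT_TIMEOUT):
    """Sketch of the live-detach loop seen in the log.

    detach_live(): asks the hypervisor to detach the device (the call
    is asynchronous, so returning does not mean the device is gone).
    wait_for_event(timeout): blocks until the device-removed event
    arrives, returning True, or False on timeout.
    """
    for attempt in range(1, max_attempts + 1):
        detach_live()                 # "Attempting to detach device ..."
        if wait_for_event(timeout):   # "Start waiting for the detach event"
            return attempt            # "Successfully detached device ..."
    raise DeviceDetachFailed("device still attached after %d attempts"
                             % max_attempts)
```

In the successful case above the event arrives on the first attempt, so the loop exits immediately; the retry budget only matters when the guest OS is slow to release the device.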
Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.027s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ef30a250-05db-47db-ac04-18a4695414c4 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "interface-b2741568-56f8-41e0-a67f-5b8951faeef5-None" "released" by "nova.compute.manager.ComputeManager.attach_interface..do_attach_interface" :: held 9.423s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-05fe80bb-558e-455c-bcf0-9972ed513788 
req-5a621e57-6e61-4330-9708-0df8d1beb307 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received event network-vif-plugged-16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-05fe80bb-558e-455c-bcf0-9972ed513788 req-5a621e57-6e61-4330-9708-0df8d1beb307 service nova] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-05fe80bb-558e-455c-bcf0-9972ed513788 req-5a621e57-6e61-4330-9708-0df8d1beb307 service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-05fe80bb-558e-455c-bcf0-9972ed513788 req-5a621e57-6e61-4330-9708-0df8d1beb307 service nova] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-05fe80bb-558e-455c-bcf0-9972ed513788 req-5a621e57-6e61-4330-9708-0df8d1beb307 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] No waiting events found dispatching network-vif-plugged-16800502-f6c4-482f-ac27-a04ec6e0393a {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:39:01 devstack nova-compute[86443]: WARNING 
nova.compute.manager [req-05fe80bb-558e-455c-bcf0-9972ed513788 req-5a621e57-6e61-4330-9708-0df8d1beb307 service nova] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Received unexpected event network-vif-plugged-16800502-f6c4-482f-ac27-a04ec6e0393a for instance with vm_state active and task_state None. Mai 07 19:39:01 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:02 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'flavor' on Instance uuid 17b05464-04bc-4019-8dfd-e4dd51eed233 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:39:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:02 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:03 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 
tempest-TaggedAttachmentsTest-239580738-project-member] Lazy-loading 'flavor' on Instance uuid b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:39:03 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-6bc2a1ad-1724-47aa-91d9-55550e174c3a tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 2.500s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:03 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 1.521s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 
tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:04 devstack nova-compute[86443]: INFO nova.compute.manager [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Terminating instance Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Start destroying the instance on the hypervisor. {{(pid=86443) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3332}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG nova.utils [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Queued Task(fn=>, remaining_delay=14.999059377999856 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:39:04 devstack 
nova-compute[86443]: DEBUG nova.utils [-] Received Task(fn=>, remaining_delay=14.996332494999933 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the deadline of Task(fn=>, remaining_delay=14.996014019000086 future=) {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:39:04 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [-] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Instance destroyed successfully. Mai 07 19:39:04 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'resources' on Instance uuid 17b05464-04bc-4019-8dfd-e4dd51eed233 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e8c17100-891e-4646-9b37-9d4cdf694005 req-aae97974-6978-459c-8160-ae434326f933 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-vif-unplugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e8c17100-891e-4646-9b37-9d4cdf694005 req-aae97974-6978-459c-8160-ae434326f933 service nova] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e8c17100-891e-4646-9b37-9d4cdf694005 req-aae97974-6978-459c-8160-ae434326f933 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.008s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e8c17100-891e-4646-9b37-9d4cdf694005 req-aae97974-6978-459c-8160-ae434326f933 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e8c17100-891e-4646-9b37-9d4cdf694005 req-aae97974-6978-459c-8160-ae434326f933 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] No waiting events found dispatching network-vif-unplugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e8c17100-891e-4646-9b37-9d4cdf694005 req-aae97974-6978-459c-8160-ae434326f933 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-vif-unplugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:38:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1979295383',display_name='tempest-AttachVolumeNegativeTest-server-1979295383',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1979295383',id=13,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMwPjyPNCSPtoqoqYlAGIuvsr9bwVFpPeklDJeHZSk/RaYM/Sy1EfiGX+0yUBpXXT4QxYcF2uzijc18TPZPHMEt3dMBq14hDEz5D1pUl9mw7SgCScXfQb+m05E9Nyxs1qA==',key_name='tempest-keypair-658850004',keypairs=,launch_index=0,launched_at=2026-05-07T17:38:21Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-s728gv3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:38:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=17b05464-04bc-4019-8dfd-e4dd51eed233,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.156", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:881}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "address": "fa:16:3e:71:2a:5b", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.156", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4bf19ba-d2", "ovs_interfaceid": "e4bf19ba-d2b9-4920-a784-c31b806fc92a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:2a:5b,bridge_name='br-int',has_traffic_filtering=True,id=e4bf19ba-d2b9-4920-a784-c31b806fc92a,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4bf19ba-d2') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_vif [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2a:5b,bridge_name='br-int',has_traffic_filtering=True,id=e4bf19ba-d2b9-4920-a784-c31b806fc92a,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4bf19ba-d2') {{(pid=86443) unplug /opt/stack/data/venv/lib/python3.12/site-packages/os_vif/__init__.py:109}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:05 devstack nova-compute[86443]: 
DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4bf19ba-d2, bridge=br-int, if_exists=True) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:05 devstack nova-compute[86443]: INFO nova.compute.manager [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Attaching volume 0a8e62b4-a407-4c9e-a446-8a65a08391ba to /dev/vdb Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 11 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e2b6a851-4ded-4c62-b0b4-9b56603b2047) {{(pid=86443) do_commit /opt/stack/data/venv/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:248}} Mai 07 19:39:05 devstack nova-compute[86443]: INFO os_vif [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2a:5b,bridge_name='br-int',has_traffic_filtering=True,id=e4bf19ba-d2b9-4920-a784-c31b806fc92a,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4bf19ba-d2') Mai 07 19:39:05 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Deleting instance files /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233_del Mai 07 19:39:05 
devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Deletion of /opt/stack/data/nova/instances/17b05464-04bc-4019-8dfd-e4dd51eed233_del complete Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.166', 'multipath': False, 'enforce_multipath': True, 'host': 'devstack', 'execute': None}" {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:175}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.scaleio [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Failed to query sdc guid: [Errno 2] No such file or directory {{(pid=86443) _get_guid /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/scaleio.py:91}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Running cmd (subprocess): nvme version {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:543}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.nvmeof [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] nvme not present on system {{(pid=86443) nvme_present /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/nvmeof.py:782}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] LIGHTOS: [Errno 111] Connection refused {{(pid=86443) find_dsc /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:161}} Mai 07 19:39:05 devstack nova-compute[86443]: INFO os_brick.initiator.connectors.lightos [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Current host hostNQN and IP(s) are ['192.168.122.166', 'fe80::5054:ff:fe02:c903', '172.24.4.1', '2001:db8::2', 'fe80::e4ed:27ff:fed0:dc45', 'fe80::fc16:3eff:feac:15e8', 'fe80::88b2:9aff:fead:e331', 'fe80::fc16:3eff:fe0c:e198', 'fe80::7c43:7bff:fe89:fd36'] Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] LIGHTOS: did not find dsc, continuing anyway. 
{{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:136}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_brick.initiator.connectors.lightos [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] LIGHTOS: no hostnqn found. {{(pid=86443) get_connector_properties /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:145}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG os_brick.utils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] <== get_connector_properties: return (110ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.166', 'host': 'devstack', 'multipath': False, 'enforce_multipath': True, 'initiator': 'iqn.2016-04.com.open-iscsi:ab61f14d7e3e', 'do_local_attach': False, 'uuid': 'e51d0ed5-0776-4376-a81f-1e084ffcb1c6', 'system uuid': '1edef36a-6b3a-4b67-b01c-d6a682c117a8', 'nvme_native_multipath': False} {{(pid=86443) trace_logging_wrapper /opt/stack/data/venv/lib/python3.12/site-packages/os_brick/utils.py:202}} Mai 07 19:39:05 devstack nova-compute[86443]: DEBUG nova.virt.block_device [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Updating existing volume attachment record: b77a0e00-731f-4822-89c3-242a98883865 {{(pid=86443) _volume_attach /opt/stack/nova/nova/virt/block_device.py:666}} Mai 07 19:39:06 devstack nova-compute[86443]: INFO nova.compute.manager [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Took 1.35 seconds to destroy the instance on the 
hypervisor. Mai 07 19:39:06 devstack nova-compute[86443]: DEBUG oslo.service.backend._common.loopingcall [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=86443) func /opt/stack/data/venv/lib/python3.12/site-packages/oslo_service/backend/_common/loopingcall.py:419}} Mai 07 19:39:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [-] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Deallocating network for instance {{(pid=86443) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2456}} Mai 07 19:39:06 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] deallocate_for_instance() {{(pid=86443) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1842}} Mai 07 19:39:06 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:39:06 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:39:06 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:06 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-53cdbbbf-d7d5-4a82-a49f-f42258dab0e9 req-eeaa9fc2-fb39-4e23-8193-934942acdc49 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-vif-deleted-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:39:06 devstack nova-compute[86443]: INFO nova.compute.manager [req-53cdbbbf-d7d5-4a82-a49f-f42258dab0e9 req-eeaa9fc2-fb39-4e23-8193-934942acdc49 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Neutron deleted interface e4bf19ba-d2b9-4920-a784-c31b806fc92a; detaching it from the instance and deleting it from the info cache Mai 07 19:39:06 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-53cdbbbf-d7d5-4a82-a49f-f42258dab0e9 req-eeaa9fc2-fb39-4e23-8193-934942acdc49 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e0bc7d7b-b7d9-4ba2-90fa-a5ddb44c44da req-9222bacd-ab57-46f1-9543-746bb59bad77 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-vif-unplugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e0bc7d7b-b7d9-4ba2-90fa-a5ddb44c44da req-9222bacd-ab57-46f1-9543-746bb59bad77 service nova] Acquiring lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e0bc7d7b-b7d9-4ba2-90fa-a5ddb44c44da req-9222bacd-ab57-46f1-9543-746bb59bad77 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-e0bc7d7b-b7d9-4ba2-90fa-a5ddb44c44da req-9222bacd-ab57-46f1-9543-746bb59bad77 service nova] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e0bc7d7b-b7d9-4ba2-90fa-a5ddb44c44da req-9222bacd-ab57-46f1-9543-746bb59bad77 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] No waiting events found dispatching network-vif-unplugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a {{(pid=86443) pop_instance_event /opt/stack/nova/nova/compute/manager.py:343}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-e0bc7d7b-b7d9-4ba2-90fa-a5ddb44c44da req-9222bacd-ab57-46f1-9543-746bb59bad77 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Received event network-vif-unplugged-e4bf19ba-d2b9-4920-a784-c31b806fc92a for instance with task_state deleting. 
{{(pid=86443) _process_instance_event /opt/stack/nova/nova/compute/manager.py:11764}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG nova.network.neutron [-] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:39:07 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-53cdbbbf-d7d5-4a82-a49f-f42258dab0e9 req-eeaa9fc2-fb39-4e23-8193-934942acdc49 service nova] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Detach interface failed, port_id=e4bf19ba-d2b9-4920-a784-c31b806fc92a, reason: Instance 17b05464-04bc-4019-8dfd-e4dd51eed233 could not be found. {{(pid=86443) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:11820}} Mai 07 19:39:07 devstack nova-compute[86443]: INFO nova.compute.manager [-] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Took 1.90 seconds to deallocate network for instance. Mai 07 19:39:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:08 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:08 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None 
req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:39:08 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:39:09 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:39:09 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.122s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:09 devstack nova-compute[86443]: INFO 
nova.scheduler.client.report [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Deleted allocations for instance 17b05464-04bc-4019-8dfd-e4dd51eed233 Mai 07 19:39:10 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:10 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-ed2b389a-9859-4e83-b799-75bdc67c6203 tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "17b05464-04bc-4019-8dfd-e4dd51eed233" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.589s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:11 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:12 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:15 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:16 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "connect_qb_volume" by 
"nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: waited 0.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] systemd detected. {{(pid=86443) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:167}} Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.volume.quobyte [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Mounting volume osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa at mount point /opt/stack/data/nova/mnt/2e124af688cb66bbd7c0d412252f4b22 via systemd-run {{(pid=86443) mount_volume /opt/stack/nova/nova/virt/libvirt/volume/quobyte.py:79}} Mai 07 19:39:17 devstack nova-compute[86443]: INFO nova.virt.libvirt.volume.quobyte [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Mounted volume: osci02.corp.quobyte.com/cinder-vol-1d971a4c-1ce1-46c7-a94d-347e695e16aa Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 
tempest-TaggedAttachmentsTest-239580738-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.connect_volume" :: held 0.222s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lazy-loading 'flavor' on Instance uuid b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "4fb8f8e9-44e0-4958-8aad-28871328581e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:17 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "4fb8f8e9-44e0-4958-8aad-28871328581e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:18 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] attach device xml: Mai 07 19:39:18 devstack nova-compute[86443]: Mai 07 19:39:18 devstack nova-compute[86443]: Mai 07 19:39:18 
devstack nova-compute[86443]: Mai 07 19:39:18 devstack nova-compute[86443]: Mai 07 19:39:18 devstack nova-compute[86443]: 0a8e62b4-a407-4c9e-a446-8a65a08391ba Mai 07 19:39:18 devstack nova-compute[86443]: Mai 07 19:39:18 devstack nova-compute[86443]: {{(pid=86443) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:351}} Mai 07 19:39:18 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Starting instance... {{(pid=86443) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2605}} Mai 07 19:39:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:18 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:18 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=86443) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2630}} Mai 07 19:39:18 devstack nova-compute[86443]: INFO nova.compute.claims [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Claim successful on node devstack Mai 07 19:39:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No BDM found with device name vda, not building metadata. {{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:39:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=86443) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13207}} Mai 07 19:39:19 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] No VIF found with MAC fa:16:3e:ac:15:e8, not building metadata {{(pid=86443) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:13183}} Mai 07 19:39:19 devstack nova-compute[86443]: DEBUG nova.utils [-] Task(fn=>, remaining_delay=-0.0009702419997665856 future=) submitted to {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:39:19 devstack nova-compute[86443]: DEBUG nova.utils [-] Waiting for the next task {{(pid=86443) _log /opt/stack/nova/nova/utils.py:1500}} Mai 07 19:39:19 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] Emitting event Stopped> {{(pid=86443) emit_event /opt/stack/nova/nova/virt/driver.py:1874}} Mai 07 19:39:19 devstack nova-compute[86443]: INFO nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] VM Stopped (Lifecycle Event) Mai 07 19:39:20 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Available memory encrypted slots: amd_sev=0 amd_sev_es=0 {{(pid=86443) _get_memory_encryption_inventories /opt/stack/nova/nova/virt/libvirt/driver.py:9951}} Mai 07 19:39:20 devstack nova-compute[86443]: DEBUG nova.compute.provider_tree [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed in ProviderTree for provider: cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 {{(pid=86443) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Mai 07 19:39:20 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-bc064e58-43c4-46e0-9312-5567cd637902 tempest-AttachSCSIVolumeTestJSON-1466334568 tempest-AttachSCSIVolumeTestJSON-1466334568-project-member] [instance: 17b05464-04bc-4019-8dfd-e4dd51eed233] Checking state {{(pid=86443) _get_power_state /opt/stack/nova/nova/compute/manager.py:1957}} Mai 07 19:39:20 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:20 devstack nova-compute[86443]: DEBUG nova.scheduler.client.report [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Inventory has not changed for provider cdec9dae-2ed4-4fdf-a972-e5c56ba8b944 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 11961, 'reserved': 512, 'min_unit': 1, 'max_unit': 11961, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 25, 'reserved': 0, 'min_unit': 1, 'max_unit': 25, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86443) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:958}} Mai 07 19:39:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.144s {{(pid=86443) inner 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Start building networks asynchronously for instance. {{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3003}} Mai 07 19:39:21 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-7cd8c253-6fb5-4a77-ac93-594640cd909e tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 16.002s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:21 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Allocating IP information in the background. 
{{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2148}} Mai 07 19:39:21 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] allocate_for_instance() {{(pid=86443) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1187}} Mai 07 19:39:21 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:39:21 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. 
Mai 07 19:39:21 devstack nova-compute[86443]: DEBUG nova.policy [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd98e17081ef14e01bf76138813a4d56a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1ea2fed9f654419a4de1a6168d279ab', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=86443) authorize /opt/stack/nova/nova/policy.py:192}} Mai 07 19:39:21 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:22 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Mai 07 19:39:22 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Start building block device mappings for instance. 
{{(pid=86443) _build_resources /opt/stack/nova/nova/compute/manager.py:3038}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Successfully created port: 3f5afc0e-b6d8-47b3-a26f-a95e84624336 {{(pid=86443) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:529}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Start spawning the instance on the hypervisor. {{(pid=86443) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2811}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Creating instance directory {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5215}} Mai 07 19:39:23 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Creating image(s) Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "/opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=86443) 
inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "/opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.004s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) 
{{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.193s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "d8d56ca44922efe85609619d01052c20f44c056a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_utils.imageutils.format_inspector [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) {{(pid=86443) _process_chunk /opt/stack/data/venv/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1538}} Mai 07 19:39:23 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None 
req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "b2741568-56f8-41e0-a67f-5b8951faeef5" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.158s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk 1073741824 {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a,backing_fmt=raw /opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk 1073741824" returned: 0 in 0.072s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "d8d56ca44922efe85609619d01052c20f44c056a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.245s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d8d56ca44922efe85609619d01052c20f44c056a --force-share --output=json" returned: 0 in 0.156s {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk. size=1073741824 {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:178}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk --force-share --output=json {{(pid=86443) execute /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:440}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.processutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CMD "/opt/stack/data/venv/bin/python3.12 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=86443) execute 
/opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/processutils.py:468}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.disk.api [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Cannot resize image /opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/disk to a smaller size. {{(pid=86443) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:184}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Created local disks {{(pid=86443) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:5347}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Ensure instance console log exists: /opt/stack/data/nova/instances/4fb8f8e9-44e0-4958-8aad-28871328581e/console.log {{(pid=86443) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:5094}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 
tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:24 devstack nova-compute[86443]: INFO nova.compute.manager [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Detaching volume 0a8e62b4-a407-4c9e-a446-8a65a08391ba Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Successfully updated port: 3f5afc0e-b6d8-47b3-a26f-a95e84624336 {{(pid=86443) _update_port /opt/stack/nova/nova/network/neutron.py:567}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Received event network-changed-3f5afc0e-b6d8-47b3-a26f-a95e84624336 {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11986}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.compute.manager [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] [instance: 
4fb8f8e9-44e0-4958-8aad-28871328581e] Refreshing instance network info cache due to event network-changed-3f5afc0e-b6d8-47b3-a26f-a95e84624336. {{(pid=86443) external_instance_event /opt/stack/nova/nova/compute/manager.py:11991}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] Acquiring lock "refresh_cache-4fb8f8e9-44e0-4958-8aad-28871328581e" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] Acquired lock "refresh_cache-4fb8f8e9-44e0-4958-8aad-28871328581e" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Refreshing network info cache for port 3f5afc0e-b6d8-47b3-a26f-a95e84624336 {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2046}} Mai 07 19:39:24 devstack nova-compute[86443]: INFO nova.virt.block_device [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] [instance: b2741568-56f8-41e0-a67f-5b8951faeef5] Attempting to driver detach volume 0a8e62b4-a407-4c9e-a446-8a65a08391ba from mountpoint /dev/vdb Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Found disk vdb by alias ua-0a8e62b4-a407-4c9e-a446-8a65a08391ba {{(pid=86443) _get_guest_disk_device 
/opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Found disk vdb by alias ua-0a8e62b4-a407-4c9e-a446-8a65a08391ba {{(pid=86443) _get_guest_disk_device /opt/stack/nova/nova/virt/libvirt/driver.py:2892}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Attempting to detach device vdb from instance b2741568-56f8-41e0-a67f-5b8951faeef5 from the persistent domain config. {{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] detach device xml: [multi-line disk XML stripped during capture; only the serial 0a8e62b4-a407-4c9e-a446-8a65a08391ba survives] {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:39:24 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Successfully detached device vdb from instance b2741568-56f8-41e0-a67f-5b8951faeef5 from the persistent domain config. Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] (1/8): Attempting to detach device vdb with device alias ua-0a8e62b4-a407-4c9e-a446-8a65a08391ba from instance b2741568-56f8-41e0-a67f-5b8951faeef5 from the live domain config. {{(pid=86443) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2676}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] detach device xml: [multi-line disk XML stripped during capture; only the serial 0a8e62b4-a407-4c9e-a446-8a65a08391ba survives] {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-701ae442-dc8f-4850-ba4e-584bf55067ad None None] Received event ua-0a8e62b4-a407-4c9e-a446-8a65a08391ba> from libvirt while the driver is waiting for it; dispatched. {{(pid=86443) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2529}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Start waiting for the detach event from libvirt for device vdb with device alias ua-0a8e62b4-a407-4c9e-a446-8a65a08391ba for instance b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2756}} Mai 07 19:39:24 devstack nova-compute[86443]: INFO nova.virt.libvirt.driver [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Successfully detached device vdb from instance b2741568-56f8-41e0-a67f-5b8951faeef5 from the live domain config.
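The "(1/8): Attempting to detach device vdb" and "Start waiting for the detach event from libvirt" entries above reflect a detach-then-wait loop: the driver issues the detach, then blocks until libvirt's device-removed event arrives, retrying up to a fixed number of attempts. A minimal sketch of that pattern using `threading.Event`; the class, method names, and timeout value are illustrative, not Nova's actual API:

```python
import threading

class DeviceDetacher:
    """Detach a device, waiting for the hypervisor's removal event."""

    def __init__(self, max_attempts=8, event_timeout=5.0):
        self.max_attempts = max_attempts
        self.event_timeout = event_timeout
        self.removed = threading.Event()  # set by the event-handler thread

    def on_device_removed(self, alias):
        # Called when the hypervisor reports the device as removed.
        self.removed.set()

    def detach(self, send_detach):
        for attempt in range(1, self.max_attempts + 1):
            send_detach()  # ask the hypervisor to detach the device
            if self.removed.wait(self.event_timeout):
                return attempt  # removal confirmed on this attempt
        raise TimeoutError("device never reported removed")

d = DeviceDetacher()
d.on_device_removed("ua-0a8e62b4")   # event arrives (here: immediately)
print(d.detach(lambda: None))        # prints 1
```

In the log the event ("Received event ua-0a8e62b4-…") happens to land even before the waiter starts blocking, which is why the driver notes it was "dispatched" while waiting and the detach succeeds on attempt 1 of 8.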
Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "connect_qb_volume" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "connect_qb_volume" acquired by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: waited 0.000s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:24 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "connect_qb_volume" "released" by "nova.virt.libvirt.volume.quobyte.LibvirtQuobyteVolumeDriver.disconnect_volume" :: held 0.023s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquiring lock "refresh_cache-4fb8f8e9-44e0-4958-8aad-28871328581e" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:390}} Mai 07 19:39:25 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] The python binding code in neutronclient is 
deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Instance cache missing network info. {{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG nova.network.neutron [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Updating instance_info_cache with network_info: [] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lazy-loading 'flavor' on Instance uuid b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [req-6203aba9-f049-412d-9855-ff623f19aa5f req-b7de67d9-ac84-4102-82ea-16982f83fcbf service nova] Releasing lock "refresh_cache-4fb8f8e9-44e0-4958-8aad-28871328581e" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Acquired lock 
"refresh_cache-4fb8f8e9-44e0-4958-8aad-28871328581e" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:393}} Mai 07 19:39:25 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Building network info cache for instance {{(pid=86443) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2049}} Mai 07 19:39:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Acquiring lock "interface-b2741568-56f8-41e0-a67f-5b8951faeef5-16800502-f6c4-482f-ac27-a04ec6e0393a" by "nova.compute.manager.ComputeManager.detach_interface..do_detach_interface" {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:506}} Mai 07 19:39:26 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "interface-b2741568-56f8-41e0-a67f-5b8951faeef5-16800502-f6c4-482f-ac27-a04ec6e0393a" acquired by "nova.compute.manager.ComputeManager.detach_interface..do_detach_interface" :: waited 0.005s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:519}} Mai 07 19:39:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Instance cache missing network info. 
{{(pid=86443) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3362}} Mai 07 19:39:26 devstack nova-compute[86443]: WARNING neutronclient.v2_0.client [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release. Mai 07 19:39:26 devstack nova-compute[86443]: DEBUG nova.network.neutron [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Updating instance_info_cache with network_info: [{"id": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "address": "fa:16:3e:23:49:96", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f5afc0e-b6", "ovs_interfaceid": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=86443) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:104}} Mai 07 19:39:26 devstack nova-compute[86443]: DEBUG 
nova.objects.instance [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lazy-loading 'flavor' on Instance uuid b2741568-56f8-41e0-a67f-5b8951faeef5 {{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:39:26 devstack nova-compute[86443]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 15 {{(pid=86443) __log_wakeup /opt/stack/data/venv/lib/python3.12/site-packages/ovs/poller.py:263}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-2e3a8dfb-387b-475f-867f-777f515343ef tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Lock "b2741568-56f8-41e0-a67f-5b8951faeef5" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 3.096s {{(pid=86443) inner /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:538}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG oslo_concurrency.lockutils [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Releasing lock "refresh_cache-4fb8f8e9-44e0-4958-8aad-28871328581e" {{(pid=86443) lock /opt/stack/data/venv/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:413}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.compute.manager [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Instance network_info: |[{"id": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "address": "fa:16:3e:23:49:96", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f5afc0e-b6", "ovs_interfaceid": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=86443) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:2163}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] [instance: 4fb8f8e9-44e0-4958-8aad-28871328581e] Start _get_guest_xml network_info=[{"id": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "address": "fa:16:3e:23:49:96", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap3f5afc0e-b6", "ovs_interfaceid": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'image_id': 'e8633b10-b98a-4580-90f8-3091ca40fa29'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None {{(pid=86443) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:8192}} Mai 07 19:39:27 devstack nova-compute[86443]: WARNING nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='e8633b10-b98a-4580-90f8-3091ca40fa29', instance_meta=NovaInstanceMeta(name='tempest-AttachVolumeNegativeTest-server-1179786861', uuid='4fb8f8e9-44e0-4958-8aad-28871328581e'), owner=OwnerMeta(userid='d98e17081ef14e01bf76138813a4d56a', username='tempest-AttachVolumeNegativeTest-429871213-project-member', projectid='b1ea2fed9f654419a4de1a6168d279ab', projectname='tempest-AttachVolumeNegativeTest-429871213'), image=ImageMeta(id='e8633b10-b98a-4580-90f8-3091ca40fa29', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='42', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "address": "fa:16:3e:23:49:96", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f5afc0e-b6", "ovs_interfaceid": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='33.1.0', creation_time=1778175567.3241777) {{(pid=86443) get_instance_driver_metadata /opt/stack/nova/nova/virt/driver.py:438}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V1... {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1783}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller missing on host. {{(pid=86443) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1793}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Searching host: 'devstack' for CPU controller through CGroups V2... {{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1802}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.host [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU controller found on host. 
{{(pid=86443) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1809}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] CPU mode 'host-passthrough' models '' was chosen, with extra flags: '' {{(pid=86443) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5886}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Getting desirable topologies for flavor Flavor(created_at=2026-05-07T17:26:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=192,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2026-05-07T17:25:50Z,direct_url=,disk_format='qcow2',id=e8633b10-b98a-4580-90f8-3091ca40fa29,min_disk=0,min_ram=0,name='cirros-0.6.3-x86_64-disk',owner='cf74477e441f4bff8612e7085667831b',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2026-05-07T17:25:52Z,virtual_size=,visibility=), allow threads: True {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:703}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:488}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None 
req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image limits 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:492}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Flavor pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:528}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Image pref 0:0:0 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:532}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=86443) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:570}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:709}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:611}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Got 1 possible topologies {{(pid=86443) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:641}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:715}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.hardware [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=86443) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:717}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-05-07T17:39:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1179786861',display_name='tempest-AttachVolumeNegativeTest-server-1179786861',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-attachvolumenegativetest-server-1179786861',id=15,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA1yg2qTlkqQjGTyDmp+tBXQeDdyTIHUCmcdJoxJFm+33PtBOhvmwW9Ybup1FYTGe8zQorws10N0hFUIWihJARINAQAceR2pE6oX+OmSAW65+mA40rL/Cstfjys+eWfcNA==',key_name='tempest-keypair-1378593835',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='devstack',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1ea2fed9f654419a4de1a6168d279ab',ramdisk_id='',reservation_id='r-nvqb7r67',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachV
olumeNegativeTest-429871213',owner_user_name='tempest-AttachVolumeNegativeTest-429871213-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-05-07T17:39:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d98e17081ef14e01bf76138813a4d56a',uuid=4fb8f8e9-44e0-4958-8aad-28871328581e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "address": "fa:16:3e:23:49:96", "network": {"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f5afc0e-b6", "ovs_interfaceid": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converting VIF {"id": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "address": "fa:16:3e:23:49:96", "network": 
{"id": "da444429-bde6-43b2-bcc1-c50e42420cb9", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-962614421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1ea2fed9f654419a4de1a6168d279ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f5afc0e-b6", "ovs_interfaceid": "3f5afc0e-b6d8-47b3-a26f-a95e84624336", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:49:96,bridge_name='br-int',has_traffic_filtering=True,id=3f5afc0e-b6d8-47b3-a26f-a95e84624336,network=Network(da444429-bde6-43b2-bcc1-c50e42420cb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f5afc0e-b6') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.objects.instance [None req-00e69a8b-1a7a-4724-89f1-93c7a4f6100f tempest-AttachVolumeNegativeTest-429871213 tempest-AttachVolumeNegativeTest-429871213-project-member] Lazy-loading 'pci_devices' on Instance uuid 4fb8f8e9-44e0-4958-8aad-28871328581e 
{{(pid=86443) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1148}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.vif [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-05-07T17:38:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=InstanceDeviceMetadata,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1300705151',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='devstack',hostname='tempest-device-tagging-server-1300705151',id=14,image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKD8Vpl4njOm4xqBb+sXv53N4PvGSB1MN/VwWG8pWG9a2+Dq9PgXsVA3B1IWStlC5rKiXUNab63OAul7RQXaq+YEV2O5UjISdIqUjRa7gCmodUctU0V+y09ZxaU9LZ1QA==',key_name='tempest-keypair-1027236808',keypairs=,launch_index=0,launched_at=2026-05-07T17:38:28Z,launched_on='devstack',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=,new_flavor=None,node='devstack',numa_topology=,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c4d37f2809da402f9ff2f13e7afa7372',ramdisk_id='',reservation_id='r-q3qsjhpd',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e8633b10-b98a-4580-90f8-3091ca40fa29',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.3-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-TaggedAttachmentsTest-239580738',owner_user_name='tempest-TaggedAttachmentsTest-239580738-project-member'},tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-05-07T17:38:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='21cc0090dbde493eb6cf6ce289991c0f',uuid=b2741568-56f8-41e0-a67f-5b8951faeef5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": 
"tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=86443) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:598}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converting VIF {"id": "16800502-f6c4-482f-ac27-a04ec6e0393a", "address": "fa:16:3e:0c:e1:98", "network": {"id": "d500267b-793e-4389-b798-4737dc7c6ecd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-2100798819", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4d37f2809da402f9ff2f13e7afa7372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "ovs_create_tap": false, "bridge_name": "br-int", 
"datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16800502-f6", "ovs_interfaceid": "16800502-f6c4-482f-ac27-a04ec6e0393a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:523}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.network.os_vif_util [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:e1:98,bridge_name='br-int',has_traffic_filtering=True,id=16800502-f6c4-482f-ac27-a04ec6e0393a,network=Network(d500267b-793e-4389-b798-4737dc7c6ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16800502-f6') {{(pid=86443) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:560}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] looking for interface given config: {{(pid=86443) get_interface_by_cfg /opt/stack/nova/nova/virt/libvirt/guest.py:251}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] looking for interface given config: {{(pid=86443) get_interface_by_cfg /opt/stack/nova/nova/virt/libvirt/guest.py:251}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.driver [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] Attempting to detach device tap16800502-f6 from instance 
b2741568-56f8-41e0-a67f-5b8951faeef5 from the persistent domain config. {{(pid=86443) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2642}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] detach device xml: [interface XML omitted: element markup was stripped in this capture] {{(pid=86443) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:481}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] looking for interface given config: {{(pid=86443) get_interface_by_cfg /opt/stack/nova/nova/virt/libvirt/guest.py:251}} Mai 07 19:39:27 devstack nova-compute[86443]: DEBUG nova.virt.libvirt.guest [None req-08baeef7-7b31-4420-a9fe-eef169ce4889 tempest-TaggedAttachmentsTest-239580738 tempest-TaggedAttachmentsTest-239580738-project-member] interface for config: not found in domain: [domain XML omitted: element markup stripped in this capture; surviving values include instance-0000000e, uuid b2741568-56f8-41e0-a67f-5b8951faeef5, name tempest-device-tagging-server-1300705151, creationTime 2026-05-07 17:39:01, 192 MB memory / 1 vCPU, image container_format bare / disk_format qcow2, cdrom bus ide, disk bus virtio, machine type pc, virtio rng/video/vif models, owner tempest-TaggedAttachmentsTest-239580738-project-member / tempest-TaggedAttachmentsTest-239580738, memory 196608 KiB] Mai 
07 19:39:27 devstack nova-compute[86443]: /machine Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: OpenStack Foundation Mai 07 19:39:27 devstack nova-compute[86443]: OpenStack Nova Mai 07 19:39:27 devstack nova-compute[86443]: 33.1.0 Mai 07 19:39:27 devstack nova-compute[86443]: b2741568-56f8-41e0-a67f-5b8951faeef5 Mai 07 19:39:27 devstack nova-compute[86443]: b2741568-56f8-41e0-a67f-5b8951faeef5 Mai 07 19:39:27 devstack nova-compute[86443]: Virtual Machine Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: hvm Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: destroy Mai 07 19:39:27 devstack nova-compute[86443]: restart Mai 07 19:39:27 devstack nova-compute[86443]: destroy Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: /usr/bin/qemu-system-x86_64 Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 19:39:27 devstack nova-compute[86443]: Mai 07 