Unable to create new instances due to network error

Solution In Progress

Issue

  • When trying to create new VMs, after 2 or 3 VMs have been successfully created, we get an error message related to network allocation: Failed to allocate the network(s), not rescheduling.

  • We applied this solution; however, the issue persists. We are not using OVN, though. Also, some ports went down on some VMs when we tried to migrate them to another host.

  • When creating a VM, we see this error:

| fault                               | {'code': 500, 'created': '2022-09-06T11:30:47Z', 'message': 'Build of instance 4ffb1ccc-bde1-4146-9dfb-65c01befb9a2 aborted: Failed to allocate the network(s), not rescheduling.', 'details': 'Traceback (most recent call last):\n  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 6510, in _create_domain_and_network\n    network_info)\n  File "/usr/lib64/python3.6/contextlib.py", line 88, in __exit__\n    next(self.gen)\n  File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 478, in wait_for_instance_event\n    actual_event = event.wait()\n  File "/usr/lib/python3.6/site-packages/eventlet/event.py", line 125, in wait\n    result = hub.switch()\n  File "/usr/lib/python3.6/site-packages/eventlet/hubs/hub.py", line 298, in switch\n    return self.greenlet.switch()\neventlet.timeout.Timeout: 300 seconds\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 2442, in _build_and_run_instance\n    block_device_info=block_device_info)\n  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 3701, in spawn\n    cleanup_instance_disks=created_disks)\n  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 6533, in _create_domain_and_network\n    raise exception.VirtualInterfaceCreateException()\nnova.exception.VirtualInterfaceCreateException: Virtual Interface creation failed\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 2168, in _do_build_and_run_instance\n    filter_properties, request_spec)\n  File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 2508, in _build_and_run_instance\n    reason=msg)\nnova.exception.BuildAbortException: Build of instance 4ffb1ccc-bde1-4146-9dfb-65c01befb9a2 aborted: Failed to allocate the network(s), not rescheduling.\n'} |
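
The `eventlet.timeout.Timeout: 300 seconds` in the traceback is nova timing out while waiting for Neutron's `network-vif-plugged` event, after which it raises `VirtualInterfaceCreateException` and aborts the build. The wait is controlled by two standard nova options; the values shown below are the upstream defaults (a sketch for reference, not a recommended change):

```
# nova.conf (inside the nova_compute container on RHOSP)
[DEFAULT]
# How long nova waits for Neutron's network-vif-plugged event
# before giving up on the VIF (seconds; matches the 300 s timeout
# in the traceback).
vif_plugging_timeout = 300
# When True, a timeout aborts the instance build instead of letting
# it proceed without networking.
vif_plugging_is_fatal = True
```

Raising the timeout only hides slow plugging; since Neutron itself is erroring here, the agent-side failure below is the more likely root cause.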
  • We also see the following errors in /var/log/containers/neutron/openvswitch-agent.log:
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 2659, in rpc_loop
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     port_info, provisioning_needed)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 2108, in process_network_ports
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     devices_added_updated, provisioning_needed, re_added))
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 1989, in treat_devices_added_or_updated
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self.ext_manager.handle_port(self.context, details)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/agent/l2/l2_agent_extensions_manager.py", line 42, in handle_port
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     extension.obj.handle_port(context, data)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 328, in inner
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     return f(*args, **kwargs)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/agent/l2/extensions/qos.py", line 254, in handle_port
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self._process_reset_port(port)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/agent/l2/extensions/qos.py", line 299, in _process_reset_port
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self.qos_driver.delete(port)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/agent/l2/extensions/qos.py", line 94, in delete
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self._handle_rule_delete(port, rule_type)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/agent/l2/extensions/qos.py", line 119, in _handle_rule_delete
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     handler(port)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/extension_drivers/qos_driver.py", line 107, in delete_bandwidth_limit
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self._update_egress_bandwidth_direct(port_name, 'max_tx_rate', 0)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/extension_drivers/qos_driver.py", line 210, in _update_egress_bandwidth_direct
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     pr, pf_name = self._get_pr_info(port_name)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/extension_drivers/qos_driver.py", line 197, in _get_pr_info
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     pr = priv_utils.devlink_show_port_representor(port_representor_name)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 245, in _wrap
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     return self.channel.remote_call(name, args, kwargs)
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 224, in remote_call
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     raise exc_type(*result[2])
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Command: devlink port show vhu3a6dd68f-af -jp
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Exit code: 1
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Stdout: '\n'
2022-09-13 18:01:03.398 365359 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Stderr: 'Netdevice "vhu3a6dd68f-af" not found\n'
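
The last lines of the traceback show the QoS extension running `devlink port show vhu3a6dd68f-af -jp` against a vhost-user (`vhu*`) port. Since vhost-user interfaces are userspace sockets handled by OVS-DPDK rather than kernel netdevices, that lookup can never succeed for them. A minimal sketch of the failing check, assuming a Linux host and reusing the port name from the log above:

```shell
# Port name taken from the traceback; substitute the one in your logs.
PORT="vhu3a6dd68f-af"

# vhost-user ports do not appear under /sys/class/net and have no
# devlink port representor, so `devlink port show` exits with
# 'Netdevice "..." not found', as seen in the agent log.
if [ -e "/sys/class/net/$PORT" ]; then
    devlink port show "$PORT" -jp
else
    echo "no kernel netdevice for $PORT"
fi
```

Run on a compute host, this prints `no kernel netdevice for vhu3a6dd68f-af` for any vhost-user port, which suggests the QoS driver is applying an SR-IOV-style egress-bandwidth path to a port type it cannot handle.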

Environment

  • Red Hat OpenStack Platform 16.1 (RHOSP)
