VM creation fails from a volume migrated from a NetApp to a 3PAR Cinder backend
Issue

We need to migrate cloned volumes from a NetApp Cinder backend to a 3PAR Cinder backend and create VMs from those volumes. The steps below were followed for one such volume: the volume is cloned, the clone is migrated (NetApp to 3PAR), and VM creation then fails from the migrated volume on the 3PAR backend.

Steps to reproduce:
1) Volume created from an image in the NetApp backend:
(overcloud) [stack@director ~]$ openstack image list
+--------------------------------------+--------------------------------------+--------+
| ID | Name | Status |
+--------------------------------------+--------------------------------------+--------+
| dbea8855-e725-4812-b0e2-31774c445f40 | bmc_rhel7.6 | active |
| 02b08a79-254f-45fe-bd4e-57e52c21057e | bmc_rhel7.7 | active |
| 964c1a54-9b5a-4242-8575-4ad91b2460b2 | bmc_rhel7.8 | active |
| 4f81157d-ea01-4ff7-b589-2e56d50e7a35 | glance_test | active |
+--------------------------------------+--------------------------------------+--------+
(overcloud) [stack@director ~]$ openstack image show 964c1a54-9b5a-4242-8575-4ad91b2460b2
+------------------+-----------------------------------------------------------------------------
| Field | Value
+------------------+-----------------------------------------------------------------------------
| checksum | 7319b7124fd7b3bffd5ff7cba2ec60e9
| container_format | bare
| created_at | 2020-04-28T07:16:30Z
| disk_format | qcow2
| file | /v2/images/964c1a54-9b5a-4242-8575-4ad91b2460b2/file
| id | 964c1a54-9b5a-4242-8575-4ad91b2460b2
| min_disk | 0
| min_ram | 0
| name | bmc_rhel7.3
| owner | 869bf59fe6e743ed94b96702a3a67bcd
| properties | direct_url='file:///var/lib/glance/images/964c1a54-9b5a-4242-8575-4ad91b2460
| protected | False
| schema | /v2/schemas/image
| size | 573007360
| status | active
| tags |
| updated_at | 2020-04-28T07:16:39Z
| virtual_size | None
| visibility | public
+------------------+-----------------------------------------------------------------------------
(overcloud) [stack@director ~]$ cinder create --image 964c1a54-9b5a-4242-8575-4ad91b2460b2 --v
+--------------------------------+--------------------------------------+
| Property | Value |
+--------------------------------+--------------------------------------+
| attachments | [] |
| availability_zone | nova |
| bootable | false |
| consistencygroup_id | None |
| created_at | 2020-05-14T08:56:27.000000 |
| description | None |
| encrypted | False |
| id | 394e5ff3-273a-492c-bc35-af8e50faec23 |
| metadata | {} |
| migration_status | None |
| multiattach | False |
| name | Test-Volume-on-netapp |
| os-vol-host-attr:host | hostgroup@netapp01#OS_Vol02 |
| os-vol-mig-status-attr:migstat | None |
| os-vol-mig-status-attr:name_id | None |
| os-vol-tenant-attr:tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| replication_status | None |
| size | 50 |
| snapshot_id | None |
| source_volid | None |
| status | creating |
| updated_at | 2020-05-14T08:56:27.000000 |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
| volume_type | netapp01 |
+--------------------------------+--------------------------------------+
(overcloud) [stack@director ~]$ cinder list | grep 394e5ff3-273a-492c-bc35-af8e50faec23
| 394e5ff3-273a-492c-bc35-af8e50faec23 | downloading | Test-Volume-on-netapp |
(overcloud) [stack@director ~]$ watch 'cinder list | grep 394e5ff3-273a-492c-bc35-af8e50faec23'
(overcloud) [stack@director ~]$ cinder list | grep 394e5ff3-273a-492c-bc35-af8e50faec23
| 394e5ff3-273a-492c-bc35-af8e50faec23 | available | Test-Volume-on-netapp |
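Instead of watching `cinder list` by hand until the volume leaves the downloading state, the transition can be polled with a small helper. This is only an illustrative sketch; the `get_status` callable (standing in for a `cinder show <id>` lookup) is an assumption, not part of the reproduce steps:

```python
import time

def wait_for_volume(get_status, target="available", timeout=600, interval=5):
    """Poll a status callable until the volume reaches `target`.

    `get_status` is any zero-argument callable returning the current
    volume status string (e.g. a wrapper around `cinder show <id>`).
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = get_status()
        if status == target:
            return status
        if status in ("error", "error_deleting"):
            # Fail fast rather than waiting out the timeout.
            raise RuntimeError("volume entered %s state" % status)
        time.sleep(interval)
    raise TimeoutError("volume did not reach %r within %ss" % (target, timeout))
```

The same loop works for the later retype step, where the interesting field is `migration_status` rather than `status`.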
2) First clone created from the above volume, still in the NetApp backend:
(overcloud) [stack@director ~]$ cinder create --source-volid 394e5ff3-273a-492c-bc35-af8e50faec23
+--------------------------------+--------------------------------------+
| Property | Value |
+--------------------------------+--------------------------------------+
| attachments | [] |
| availability_zone | nova |
| bootable | true |
| consistencygroup_id | None |
| created_at | 2020-05-14T09:08:51.000000 |
| description | None |
| encrypted | False |
| id | 18c09de8-738f-4b31-b08d-0d3b2d08a6e4 |
| metadata | {} |
| migration_status | None |
| multiattach | False |
| name | clone-migrate-to-hp3par |
| os-vol-host-attr:host | hostgroup@netapp01#OS_Vol02 |
| os-vol-mig-status-attr:migstat | None |
| os-vol-mig-status-attr:name_id | None |
| os-vol-tenant-attr:tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| replication_status | None |
| size | 50 |
| snapshot_id | None |
| source_volid | 394e5ff3-273a-492c-bc35-af8e50faec23 |
| status | creating |
| updated_at | 2020-05-14T09:08:51.000000 |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
| volume_type | netapp01 |
+--------------------------------+--------------------------------------+
(overcloud) [stack@director ~]$ cinder list | grep 18c09de8-738f-4b31-b08d-0d3b2d08a6e4
| 18c09de8-738f-4b31-b08d-0d3b2d08a6e4 | available | clone-migrate-to-hp3par |
3) Second clone created from the first clone, also in the NetApp backend:
(overcloud) [stack@director ~]$ cinder create --source-volid 18c09de8-738f-4b31-b08d-0d3b2d08a6e4
+--------------------------------+--------------------------------------+
| Property | Value |
+--------------------------------+--------------------------------------+
| attachments | [] |
| availability_zone | nova |
| bootable | true |
| consistencygroup_id | None |
| created_at | 2020-05-14T09:09:41.000000 |
| description | None |
| encrypted | False |
| id | e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 |
| metadata | {} |
| migration_status | None |
| multiattach | False |
| name | clone2-migrate-to-hp3par |
| os-vol-host-attr:host | hostgroup@netapp01#OS_Vol02 |
| os-vol-mig-status-attr:migstat | None |
| os-vol-mig-status-attr:name_id | None |
| os-vol-tenant-attr:tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| replication_status | None |
| size | 50 |
| snapshot_id | None |
| source_volid | 18c09de8-738f-4b31-b08d-0d3b2d08a6e4 |
| status | creating |
| updated_at | 2020-05-14T09:09:41.000000 |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
| volume_type | netapp01 |
+--------------------------------+--------------------------------------+
(overcloud) [stack@director ~]$ cinder list | grep e1e641ff-fd7b-48fa-9203-cf0f2f091ae2
| e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 | available | clone2-migrate-to-hp3par |
(overcloud) [stack@director ~]$ cinder show e1e641ff-fd7b-48fa-9203-cf0f2f091ae2
+--------------------------------+-------------------------------------------------+
| Property | Value |
+--------------------------------+-------------------------------------------------+
| attached_servers | [] |
| attachment_ids | [] |
| availability_zone | nova |
| bootable | true |
| consistencygroup_id | None |
| created_at | 2020-05-14T09:09:41.000000 |
| description | None |
| encrypted | False |
| id | e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 |
| metadata | |
| migration_status | None |
| multiattach | False |
| name | clone2-migrate-to-hp3par |
| os-vol-host-attr:host | hostgroup@netapp01#OS_Vol02 |
| os-vol-mig-status-attr:migstat | None |
| os-vol-mig-status-attr:name_id | None |
| os-vol-tenant-attr:tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| replication_status | None |
| size | 50 |
| snapshot_id | None |
| source_volid | 18c09de8-738f-4b31-b08d-0d3b2d08a6e4 |
| status | available |
| updated_at | 2020-05-14T09:09:42.000000 |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
| volume_image_metadata | checksum : 7319b7124fd7b3bffd5ff7cba2ec60e9 |
| | container_format : bare |
| | disk_format : qcow2 |
| | image_id : 964c1a54-9b5a-4242-8575-4ad91b2460b2 |
| | image_name : bmc_rhel7.3 |
| | min_disk : 0 |
| | min_ram : 0 |
| | size : 573007360 |
| volume_type | netapp01 |
+--------------------------------+-------------------------------------------------+
4) A test VM created from the second clone (still on the NetApp backend) boots successfully:
(overcloud) [stack@director ~]$ openstack server create Test-VM-before-migrate --volume e1e641 --nic net-id=74d3ca03-49d3-46fa-b044-3ff2a5df9243 --availability-zone nova --wait
+-------------------------------------+----------------------------------------------------------
| Field | Value
+-------------------------------------+----------------------------------------------------------
| OS-DCF:diskConfig | MANUAL
| OS-EXT-AZ:availability_zone | nova
| OS-EXT-SRV-ATTR:host | overcloud-compute-0.localdomain
| OS-EXT-SRV-ATTR:hypervisor_hostname | overcloud-compute-0.localdomain
| OS-EXT-SRV-ATTR:instance_name | instance-0000013f
| OS-EXT-STS:power_state | Running
| OS-EXT-STS:task_state | None
| OS-EXT-STS:vm_state | active
| OS-SRV-USG:launched_at | 2020-05-14T09:12:17.000000
| OS-SRV-USG:terminated_at | None
| accessIPv4 |
| accessIPv6 |
| addresses | niam_Avansus_vlan_211=10.10.10.22
| adminPass | 44xdbgBufHNG
| config_drive |
| created | 2020-05-14T09:12:00Z
| flavor | Avansus_PROD_APP_4vcpus_16G (62e2d2e4-b051-417e-a8ff-617e
| hostId | 8ec05a94b67de1e5e0f29e3e710b2ff39e83d1eec934ffab8b5ba3cf
| id | 4bd20a88-81b8-4330-89f5-d096ae330262
| image |
| key_name | None
| name | Test-VM-before-migrate
| progress | 0
| project_id | 869bf59fe6e743ed94b96702a3a67bcd
| properties |
| security_groups | name='default'
| status | ACTIVE
| updated | 2020-05-14T09:12:17Z
| user_id | d93d94d0eba0474294590ba2d7557b8e
| volumes_attached | id='e1e641ff-fd7b-48fa-9203-cf0f2f091ae2'
+-------------------------------------+----------------------------------------------------------
(overcloud) [stack@director ~]$ nova delete 4bd20a88-81b8-4330-89f5-d096ae330262
Request to delete server 4bd20a88-81b8-4330-89f5-d096ae330262 has been accepted.
(overcloud) [stack@director ~]$ cinder show e1e641ff-fd7b-48fa-9203-cf0f2f091ae2
+--------------------------------+-------------------------------------------------+
| Property | Value |
+--------------------------------+-------------------------------------------------+
| attached_servers | [] |
| attachment_ids | [] |
| availability_zone | nova |
| bootable | true |
| consistencygroup_id | None |
| created_at | 2020-05-14T09:09:41.000000 |
| description | None |
| encrypted | False |
| id | e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 |
| metadata | |
| migration_status | None |
| multiattach | False |
| name | clone2-migrate-to-hp3par |
| os-vol-host-attr:host | hostgroup@netapp01#OS_Vol02 |
| os-vol-mig-status-attr:migstat | None |
| os-vol-mig-status-attr:name_id | None |
| os-vol-tenant-attr:tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| replication_status | None |
| size | 50 |
| snapshot_id | None |
| source_volid | 18c09de8-738f-4b31-b08d-0d3b2d08a6e4 |
| status | available |
| updated_at | 2020-05-14T09:12:49.000000 |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
| volume_image_metadata | checksum : 7319b7124fd7b3bffd5ff7cba2ec60e9 |
| | container_format : bare |
| | disk_format : qcow2 |
| | image_id : 964c1a54-9b5a-4242-8575-4ad91b2460b2 |
| | image_name : bmc_rhel7.3 |
| | min_disk : 0 |
| | min_ram : 0 |
| | size : 573007360 |
| volume_type | netapp01 |
+--------------------------------+-------------------------------------------------+
(overcloud) [stack@director ~]$ nova show 4bd20a88-81b8-4330-89f5-d096ae330262
ERROR (CommandError): No server with a name or ID of '4bd20a88-81b8-4330-89f5-d096ae330262' exist
5) The same clone volume is retyped and migrated from the NetApp to the 3PAR backend successfully:
(overcloud) [stack@director ~]$ cinder --debug retype --migration-policy on-demand e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 3par01
DEBUG:keystoneauth.session:REQ: curl -g -i -X GET http://10.10.10.217:5000//v3 ...
DEBUG:keystoneauth.session:RESP: [200] ...
[... keystone authentication and service catalog debug output truncated ...]
DEBUG:keystoneauth:REQ: curl -g -i -X GET http://10.10.10.217:8776/v3/869bf59fe6e743ed94b96702a3a67bcd/... -H "X-Auth-Token: {SHA1}28a6c26cc438ff58c492f545c2..."
DEBUG:keystoneauth:RESP: [200] ... x-openstack-request-id: req-670b264c-8cef-...
RESP BODY: {"volume": {"migration_status": null, "attachments": [], ... "os-vol-host-attr:host": "hostgroup@netapp01#OS_Vol02", ... "id": "e1e641ff-fd7b-48fa-9203-cf0f2f091ae2", ... "name": "clone2-migrate-to-hp3par", ... "volume_type": "netapp01"}}
DEBUG:keystoneauth:REQ: curl -g -i -X POST http://10.10.10.217:8776/v3/869bf59fe6e743ed94b96702a3... -d '{"os-retype": {"new_type": "3par01", ...}}'
DEBUG:keystoneauth:RESP: [202] ... x-openstack-request-id: req-edcc2612-9432-4522-bc2e-f4f5fbefbf18
(overcloud) [stack@director ~]$ cinder show e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 | grep status
| migration_status | migrating |
| os-vol-mig-status-attr:migstat | migrating |
| os-vol-mig-status-attr:name_id | None |
| replication_status | None |
| status | retyping |
(overcloud) [stack@director ~]$ cinder show e1e641ff-fd7b-48fa-9203-cf0f2f091ae2
+--------------------------------+-------------------------------------------------+
| Property | Value |
+--------------------------------+-------------------------------------------------+
| attached_servers | [] |
| attachment_ids | [] |
| availability_zone | nova |
| bootable | true |
| consistencygroup_id | None |
| created_at | 2020-05-14T09:09:41.000000 |
| description | None |
| encrypted | False |
| id | e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 |
| metadata | |
| migration_status | success |
| multiattach | False |
| name | clone2-migrate-to-hp3par |
| os-vol-host-attr:host | hostgroup@3par01#CPG_SSD_R6 |
| os-vol-mig-status-attr:migstat | success |
| os-vol-mig-status-attr:name_id | e5ecdee2-952d-4b7e-8624-a28057a5d82c |
| os-vol-tenant-attr:tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| replication_status | None |
| size | 50 |
| snapshot_id | None |
| source_volid | 18c09de8-738f-4b31-b08d-0d3b2d08a6e4 |
| status | available |
| updated_at | 2020-05-14T09:21:38.000000 |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
| volume_image_metadata | checksum : 7319b7124fd7b3bffd5ff7cba2ec60e9 |
| | container_format : bare |
| | disk_format : qcow2 |
| | image_id : 964c1a54-9b5a-4242-8575-4ad91b2460b2 |
| | image_name : bmc_rhel7.3 |
| | min_disk : 0 |
| | min_ram : 0 |
| | size : 573007360 |
| volume_type | 3par01 |
+--------------------------------+-------------------------------------------------+
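Note the `os-vol-mig-status-attr:name_id` field above: after a cross-backend migration, Cinder keeps the volume's original id but records the destination volume's id as `name_id`, and the backend object name should be derived from `name_id` when it is set. A minimal sketch of that mapping (the `Volume` class here is a simplified stand-in for Cinder's real volume object, not the actual implementation):

```python
class Volume:
    """Simplified stand-in for a Cinder volume object."""
    volume_name_template = "volume-%s"

    def __init__(self, vol_id, name_id=None):
        self.id = vol_id
        self._name_id = name_id  # populated after a successful migration

    @property
    def name_id(self):
        # Falls back to the original id when the volume never migrated.
        return self._name_id or self.id

    @property
    def name(self):
        # The name the backend object should be looked up under.
        return self.volume_name_template % self.name_id

migrated = Volume("e1e641ff-fd7b-48fa-9203-cf0f2f091ae2",
                  name_id="e5ecdee2-952d-4b7e-8624-a28057a5d82c")
print(migrated.name)  # volume-e5ecdee2-952d-4b7e-8624-a28057a5d82c
```

A driver that derives the array-side name from `volume.id` instead of `volume.name` would therefore look for an object that was never created on the destination backend, which matches the failure seen in the next step.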
6) VM creation fails from the migrated volume, now on the 3PAR backend:
(overcloud) [stack@director ~]$ openstack server create Test-VM-after-migrate02 --volume e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 --flavor Avansus_PROD_APP_4vcpus_16G --nic net-id=74d3ca03-49d3-46fa-b044-3ff2a5df9243 --availability-zone nova --wait
Error creating server: Test-VM-after-migrate02
Error creating server
(overcloud) [stack@director ~]$ cinder list | grep e1e641ff-fd7b-48fa-9203-cf0f2f091ae2
| e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 | available | clone2-migrate-to-hp3par | 50 | 3par01 | true | |
(overcloud) [stack@director ~]$ cinder show e1e641ff-fd7b-48fa-9203-cf0f2f091ae2
+--------------------------------+-------------------------------------------------+
| Property | Value |
+--------------------------------+-------------------------------------------------+
| attached_servers | [] |
| attachment_ids | [] |
| availability_zone | nova |
| bootable | true |
| consistencygroup_id | None |
| created_at | 2020-05-14T09:09:41.000000 |
| description | None |
| encrypted | False |
| id | e1e641ff-fd7b-48fa-9203-cf0f2f091ae2 |
| metadata | |
| migration_status | success |
| multiattach | False |
| name | clone2-migrate-to-hp3par |
| os-vol-host-attr:host | hostgroup@3par01#CPG_SSD_R6 |
| os-vol-mig-status-attr:migstat | success |
| os-vol-mig-status-attr:name_id | e5ecdee2-952d-4b7e-8624-a28057a5d82c |
| os-vol-tenant-attr:tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| replication_status | None |
| size | 50 |
| snapshot_id | None |
| source_volid | 18c09de8-738f-4b31-b08d-0d3b2d08a6e4 |
| status | available |
| updated_at | 2020-05-14T10:43:08.000000 |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
| volume_image_metadata | checksum : 7319b7124fd7b3bffd5ff7cba2ec60e9 |
| | container_format : bare |
| | disk_format : qcow2 |
| | image_id : 964c1a54-9b5a-4242-8575-4ad91b2460b2 |
| | image_name : bmc_rhel7.3 |
| | min_disk : 0 |
| | min_ram : 0 |
| | size : 573007360 |
| volume_type | 3par01 |
+--------------------------------+-------------------------------------------------+
(overcloud) [stack@director ~]$ nova show Test-VM-after-migrate02
+--------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Property | Value |
+--------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| OS-DCF:diskConfig | MANUAL |
| OS-EXT-AZ:availability_zone | |
| OS-EXT-SRV-ATTR:host | - |
| OS-EXT-SRV-ATTR:hostname | test-vm-after-migrate02 |
| OS-EXT-SRV-ATTR:hypervisor_hostname | - |
| OS-EXT-SRV-ATTR:instance_name | instance-00000148 |
| OS-EXT-SRV-ATTR:kernel_id | |
| OS-EXT-SRV-ATTR:launch_index | 0 |
| OS-EXT-SRV-ATTR:ramdisk_id | |
| OS-EXT-SRV-ATTR:reservation_id | r-f1u69aft |
| OS-EXT-SRV-ATTR:root_device_name | /dev/vda |
| OS-EXT-SRV-ATTR:user_data | - |
| OS-EXT-STS:power_state | 0 |
| OS-EXT-STS:task_state | - |
| OS-EXT-STS:vm_state | error |
| OS-SRV-USG:launched_at | - |
| OS-SRV-USG:terminated_at | - |
| accessIPv4 | |
| accessIPv6 | |
| config_drive | |
| created | 2020-05-14T10:42:51Z |
| description | Test-VM-after-migrate02 |
| fault | {"message": "Build of instance 2447bab0-62dd-4b59-a9cd-8662a4488902 aborted: Unable to update attachment.(Bad or unexpected response from the storage volume backend API: Driver initialize connection failed (error: Not found (HTTP 404) 23 - volume does not exist).). (H", "code": 500, "details": "Traceback (most recent call last): |
| | File \"/usr/lib/python2.7/site-packages/nova/compute/manager.py\", line 2008, in _do_build_and_run_instance |
| | filter_properties, request_spec) |
| | File \"/usr/lib/python2.7/site-packages/nova/compute/manager.py\", line 2234, in _build_and_run_instance |
| | bdms=block_device_mapping) |
| | File \"/usr/lib/python2.7/site-packages/oslo_utils/excutils.py\", line 220, in __exit__ |
| | self.force_reraise() |
| | File \"/usr/lib/python2.7/site-packages/oslo_utils/excutils.py\", line 196, in force_reraise |
| | six.reraise(self.type_, self.value, self.tb) |
| | File \"/usr/lib/python2.7/site-packages/nova/compute/manager.py\", line 2186, in _build_and_run_instance |
| | block_device_mapping) as resources: |
| | File \"/usr/lib64/python2.7/contextlib.py\", line 17, in __enter__ |
| | return self.gen.next() |
| | File \"/usr/lib/python2.7/site-packages/nova/compute/manager.py\", line 2396, in _build_resources |
| | reason=e.format_message()) |
| | BuildAbortException: Build of instance 2447bab0-62dd-4b59-a9cd-8662a4488902 aborted: Unable to update attachment.(Bad or unexpected response from the storage volume backend API: Driver initialize connection failed (error: Not found (HTTP 404) 23 - volume does not exist).). (HTTP 500) (Request-ID: req-6af8c8be-9f73-4077-856f-9d9f27d6b45c) |
| | ", "created": "2020-05-14T10:43:08Z"} |
| flavor:disk | 0 |
| flavor:ephemeral | 0 |
| flavor:extra_specs | {} |
| flavor:original_name | Avansus_PROD_APP_4vcpus_16G |
| flavor:ram | 16384 |
| flavor:swap | 0 |
| flavor:vcpus | 4 |
| hostId | |
| host_status | |
| id | 2447bab0-62dd-4b59-a9cd-8662a4488902 |
| image | Attempt to boot from volume - no image supplied |
| key_name | - |
| locked | False |
| metadata | {} |
| name | Test-VM-after-migrate02 |
| os-extended-volumes:volumes_attached | [{"id": "e1e641ff-fd7b-48fa-9203-cf0f2f091ae2", "delete_on_termination": false}] |
| status | ERROR |
| tags | [] |
| tenant_id | 869bf59fe6e743ed94b96702a3a67bcd |
| updated | 2020-05-14T10:43:08Z |
| user_id | d93d94d0eba0474294590ba2d7557b8e |
+--------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
7) /var/log/containers/cinder/cinder-volume.log contains the following error:
2020-05-14 16:13:05.484 46 DEBUG hpe3parclient.http [req-6af8c8be-9f73-4077-856f-9d9f27d6b45c d93d94d0eba0474294590ba2d7557b8e 869bf59fe6e743ed94b96702a3a67bcd - default default] RESP BODY:
_http_log_resp /usr/lib/python2.7/site-packages/hpe3parclient/http.py:185
2020-05-14 16:13:05.485 46 DEBUG cinder.coordination [req-6af8c8be-9f73-4077-856f-9d9f27d6b45c d93d94d0eba0474294590ba2d7557b8e 869bf59fe6e743ed94b96702a3a67bcd - default default] Lock "/var/lib/cinder/cinder-3par-e1e641ff-fd7b-48fa-9203-cf0f2f091ae2" released by "initialize_connection" :: held 8.593s _synchronized /usr/lib/python2.7/site-packages/cinder/coordination.py:162
2020-05-14 16:13:05.485 46 DEBUG cinder.volume.drivers.hpe.hpe_3par_fc [req-6af8c8be-9f73-4077-856f-9d9f27d6b45c d93d94d0eba0474294590ba2d7557b8e 869bf59fe6e743ed94b96702a3a67bcd - default default] <== decorator: exception (8594ms) HTTPNotFound() trace_logging_wrapper /usr/lib/python2.7/site-packages/cinder/utils.py:924
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager [req-6af8c8be-9f73-4077-856f-9d9f27d6b45c d93d94d0eba0474294590ba2d7557b8e 869bf59fe6e743ed94b96702a3a67bcd - default default] Driver initialize connection failed (error: Not found (HTTP 404) 23 - volume does not exist).: HTTPNotFound: Not found (HTTP 404) 23 - volume does not exist
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager Traceback (most recent call last):
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 4375, in _connection_create
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager conn_info = self.driver.initialize_connection(volume, connector)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/cinder/utils.py", line 918, in trace_logging_wrapper
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager result = f(*args, **kwargs)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/cinder/zonemanager/utils.py", line 80, in decorator
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager conn_info = initialize_connection(self, *args, **kwargs)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "<string>", line 2, in initialize_connection
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/cinder/coordination.py", line 151, in _synchronized
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager return f(*a, **k)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/hpe/hpe_3par_fc.py", line 175, in initialize_connection
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager host = self._create_host(common, volume, connector)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/hpe/hpe_3par_fc.py", line 373, in _create_host
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager cpg = common.get_cpg(volume, allowSnap=True)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/hpe/hpe_3par_common.py", line 1957, in get_cpg
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager vol = self.client.getVolume(volume_name)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/hpe3parclient/client.py", line 464, in getVolume
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager response, body = self.http.get('/volumes/%s' % name)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/hpe3parclient/http.py", line 352, in get
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager return self._cs_request(url, 'GET', **kwargs)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/hpe3parclient/http.py", line 321, in _cs_request
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager **kwargs)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/hpe3parclient/http.py", line 297, in _time_request
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager resp, body = self.request(url, method, **kwargs)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager File "/usr/lib/python2.7/site-packages/hpe3parclient/http.py", line 262, in request
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager raise exceptions.from_response(resp, body)
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager HTTPNotFound: Not found (HTTP 404) 23 - volume does not exist
2020-05-14 16:13:05.487 46 ERROR cinder.volume.manager
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server [req-6af8c8be-9f73-4077-856f-9d9f27d6b45c d93d94d0eba0474294590ba2d7557b8e 869bf59fe6e743ed94b96702a3a67bcd - default default] Exception during message handling: VolumeBackendAPIException: Bad or unexpected response from the storage volume backend API: Driver initialize connection failed (error: Not found (HTTP 404) 23 - volume does not exist).
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 166, in _process_incoming
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 4422, in attachment_update
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server connector)
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 4381, in _connection_create
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server raise exception.VolumeBackendAPIException(data=err_msg)
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server VolumeBackendAPIException: Bad or unexpected response from the storage volume backend API: Driver initialize connection failed (error: Not found (HTTP 404) 23 - volume does not exist).
2020-05-14 16:13:05.498 46 ERROR oslo_messaging.rpc.server
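The HTTP 404 from `getVolume` above is consistent with the driver resolving the array-side volume name from the original Cinder id rather than the post-migration `name_id`. The HPE 3PAR driver encodes the volume UUID into an `osv-` prefixed name; the sketch below approximates that transform (it is an illustration of the encoding idea, not the driver's exact code). Encoding the original id and the `name_id` yields two different array names, so a lookup keyed on the wrong one returns "volume does not exist":

```python
import base64
import uuid

def encode_3par_name(vol_id):
    """Approximation of how the 3PAR driver derives the array-side name:
    osv-<base64 of the UUID bytes, with characters 3PAR disallows replaced>."""
    enc = base64.b64encode(uuid.UUID(vol_id).bytes).decode()
    enc = enc.replace("+", ".").replace("/", "-").rstrip("=")
    return "osv-" + enc

orig = encode_3par_name("e1e641ff-fd7b-48fa-9203-cf0f2f091ae2")    # original id
actual = encode_3par_name("e5ecdee2-952d-4b7e-8624-a28057a5d82c")  # name_id after migration
print(orig != actual)  # True: the two ids map to different array names
```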
Environment
- Red Hat OpenStack Platform 13.0 (RHOSP)