Unbootable VMs after storage migration

Hi,

RHEV 3.1. Hypervisors: 6.4 - 20130318.1.el6_4

Lately, after moving VMs between storage domains, we've had a lot of unbootable VMs. We discovered that they were trying to boot from another vdisk instead of the primary one.

The solution is to deactivate/activate all vdisks except the primary, and then the VM boots.
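
In case it helps anyone who wants to script the workaround rather than click through the portal, here's a minimal sketch against what I believe are the RHEV 3.1 REST API disk activate/deactivate actions. The engine URL, credentials, VM ID and boot disk ID below are placeholders, and the endpoint paths should be checked against your own API version before relying on this.

```python
import xml.etree.ElementTree as ET

import requests

# Placeholder values -- replace with your engine URL, credentials,
# the VM's ID and the ID of the primary (boot) disk to leave alone.
ENGINE = "https://rhevm.example.com/api"
AUTH = ("admin@internal", "password")
HEADERS = {"Content-Type": "application/xml", "Accept": "application/xml"}
VM_ID = "vm-uuid"
BOOT_DISK_ID = "primary-disk-uuid"


def disk_action(vm_id, disk_id, action):
    # POST an empty <action/> body to the disk's activate/deactivate sub-resource.
    url = "%s/vms/%s/disks/%s/%s" % (ENGINE, vm_id, disk_id, action)
    resp = requests.post(url, data="<action/>", headers=HEADERS,
                         auth=AUTH, verify=False)
    resp.raise_for_status()


# Fetch the VM's disk list, then cycle every disk except the primary one.
disks = requests.get("%s/vms/%s/disks" % (ENGINE, VM_ID),
                     headers=HEADERS, auth=AUTH, verify=False)
disks.raise_for_status()

for disk in ET.fromstring(disks.content).findall("disk"):
    disk_id = disk.get("id")
    if disk_id != BOOT_DISK_ID:
        disk_action(VM_ID, disk_id, "deactivate")
        disk_action(VM_ID, disk_id, "activate")
```

This is just how we'd automate the same deactivate/activate cycle we do by hand in the Admin Portal; it's not a fix for the underlying problem.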

Not sure what exactly causes this; it's not reproducible every time, but it happens fairly often at our site. I wanted to mention it here in case anyone else has the same problem.

Regards, Brian

Responses

Thanks for sharing that, Brian. I'd be interested to hear whether anyone else has encountered this issue and, if so, whether this solution was helpful.

Whoa! That sounds a little like the issue we've encountered with non-bootable VMs after a snapshot removal. Again, not reproducible every time. I'd be interested to understand whether they are at all related. I shall update this thread with anything further I learn on this topic.

Cheers

Brian, I believe the issue Rich is referring to is the one he's outlined in this discussion: https://access.redhat.com/discussion/rhevm-vm-no-longer-bootable-after-snapshot-removal

Let us know if it sounds similar to you.

Richard's post is what triggered me to post our findings. But we haven't trashed any VMs; after the deactivate/activate fix, everything is OK again. Maybe Richard can post whether he's snapshotting on multiple-disk VMs and whether the fix works for him?

I'll try to find time to do some snapshotting this week and post my findings.

/Brian