I have a CentOS 7 server that isn't booting anymore; it drops into the grub rescue prompt with the following error:

    error: disk `mduuid/7d1ea14d586da4ea38a163ecc38fa93f' not found.

I booted a SystemRescue ISO, mounted the RAID array, chrooted into it, and tried to reinstall GRUB via grub2-install on both disks, sda and sdb; both attempts return the same error (the rough sequence of commands is sketched below the output):

    [root@sysrescue /]# grub2-install /dev/sda
    Installing for i386-pc platform.
    grub2-install: error: disk `mduuid/7d1ea14d586da4ea38a163ecc38fa93f' not found.

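For reference, a minimal sketch of the steps I followed from the rescue environment (the mount point and the bind mounts are from memory, so details may differ slightly):

    # from the SystemRescue shell: assemble and mount the degraded array
    mdadm --assemble --scan
    mount /dev/md127 /mnt
    # expose the virtual filesystems inside the chroot so grub2-install can see the disks
    for d in dev proc sys run; do mount --bind /$d /mnt/$d; done
    chroot /mnt /bin/bash
    grub2-install /dev/sda    # fails with the mduuid error shown above
    grub2-install /dev/sdb    # same error
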
The RAID10 array, as reported by mdadm --detail /dev/md127, looks like this:

    /dev/md127:
               Version : 1.0
         Creation Time : Wed Jan 25 15:15:48 2017
            Raid Level : raid10
            Array Size : 1953520640 (1863.02 GiB 2000.41 GB)
         Used Dev Size : 976760320 (931.51 GiB 1000.20 GB)
          Raid Devices : 4
         Total Devices : 4
           Persistence : Superblock is persistent

         Intent Bitmap : Internal

           Update Time : Wed Dec 30 18:11:18 2020
                 State : clean, degraded
        Active Devices : 2
       Working Devices : 2
        Failed Devices : 2
         Spare Devices : 0

                Layout : near=2
            Chunk Size : 512K

    Consistency Policy : bitmap

                  Name : pilot.one:root  (local to host hv.christine)
                  UUID : 7d1ea14d:586da4ea:38a163ec:c38fa93f
                Events : 2428023

        Number   Major   Minor   RaidDevice State
           -       0        0        0      removed
           1       8       17        1      active sync set-B   /dev/sdb1
           -       0        0        2      removed
           3       8       49        3      active sync set-B   /dev/sda1

I can't figure out why it keeps complaining about that md UUID, and most importantly, how to resolve it. The mduuid in the error is just this array's UUID with the colons stripped (7d1ea14d:586da4ea:38a163ec:c38fa93f), so GRUB seems to be failing to find the very array the root filesystem lives on. Could it be a grub2 bug? The data on the array is fine; the RAID10 is simply running degraded, with 2 of its 4 disks removed.
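
In case it helps with diagnosis, these are the kinds of checks that can be run from inside the chroot to see how GRUB resolves the array (assuming /boot lives on md127; I haven't pasted the output here):

    # how GRUB maps the filesystem holding /boot back to a device
    grub2-probe --target=device /boot
    # which abstraction modules GRUB thinks it needs (expect something mentioning mdraid1x for 1.x metadata)
    grub2-probe --target=abstraction /boot
    # the UUID mdadm reports for the array, for comparison with the one in the error
    mdadm --detail --export /dev/md127 | grep MD_UUID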

If I remember correctly, I had a similar issue on another server a while back, and booting a rescue ISO, mounting the RAID array, and running grub2-install on /dev/sda was all it took to fix it. Not this time, it seems. Any help appreciated.
