  • Unraid 6.6.0 under ESXi - boot failure when > 1 core


    scott47
    • Solved Minor

I have my unRaid setup running under ESXi 6 with 8 virtual cores and 16 GB RAM.  This worked fine with all prior versions, including 6.5.3, but with 6.6.0 the boot gets stuck at "x86: Booting SMP configuration:"

     

When this happened, I first moved back to 6.5.3, and unRaid booted as expected.  Then I re-downloaded the 6.6.0 update and tried booting again; it failed at the same spot.  Lastly, I tried changing the number of cores in ESXi from 8 (4 virtual sockets with 2 cores per socket) to just 1 (1 virtual socket with 1 core per socket).  unRaid 6.6.0 booted up just fine after that, and this is how I am currently running it.
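
    For anyone trying to reproduce the two configurations, the topology corresponds to a couple of standard settings in the VM's .vmx file (shown here just for reference; I made my changes through the vSphere client rather than editing the file):

    numvcpus = "8"
    cpuid.coresPerSocket = "2"

    versus the single-core layout that boots:

    numvcpus = "1"
    cpuid.coresPerSocket = "1"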

     

My system includes a Supermicro X8DTH-i motherboard, 2 x Xeon X5690 CPUs (12 physical cores + 12 hyperthreaded cores) and 96 GB ECC DDR3 RAM (48 GB per CPU).  This is running ESXi-6.0.0-20160302001-standard.

     

Honestly, I'm not 100% sure whether this is a bug or a result of running under ESXi, but I would sure like to run unRaid 6.6.0, and I have to continue running it under ESXi.

     

    Thanks for your help!

    Scott

     

     

     

Attachments: unRaidBootStuck.jpg, unRaidBootStuck2.jpg, tower-diagnostics-20180921-0829.zip





    Recommended Comments

If you can figure out what changed and what needs to be updated to fix this, I'm sure Limetech would be happy to consider modifying unRaid to get it to work, as long as the fix doesn't affect bare-metal users negatively.

     

That said, you are pretty much on your own figuring out a solution. There is a small sub-community here that runs unRaid under ESXi, but there is no official support for doing so.

     

Limetech's position on running unRaid under a hypervisor is pretty much as follows: they don't actively discourage it, but it's up to you to get it running. Any issue with unRaid must be reproducible while running bare metal in order to get troubleshooting assistance from them.

     

I suggest you get together with the other people running unRaid under ESXi and collaborate with them on this issue.


    Thanks, Jonathan, that is kind of what I figured. 🙂

     

I did figure out a few things, and while this is definitely an ESXi issue, I'm going to add them here in case they help anyone else (and in case I need to remember what I found).

     

First, I am using ESXi 6.0, and I don't know whether any of the below applies to other versions.

     

For 6.0, the "ESXi600-201808001.zip" patch fixes the issue I reported above, where the Unraid boot process stops at "Booting SMP configuration".  This means anyone experiencing this issue should be able to fix it by updating to the 201808001 patch or newer.
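
    If you're not sure what build your host is on before patching, either of these standard commands over SSH will show it:

    vmware -vl
    esxcli system version get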

     

In my case, I had one other problem: I am using 2x Supermicro AOC-SAS2LP-MV8 SAS/SATA cards (Marvell 88SE9485) and need to pass them through ESXi to Unraid.  This worked up until the "ESXi600-201711001.zip" patch, which broke passthrough for these cards.  So far I haven't been able to get them to work with the 201711001 patch or newer.
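
    I never found a fix for the passthrough problem, but for anyone who wants to experiment: ESXi keeps per-device passthrough quirks in /etc/vmware/passthru.map, keyed by PCI vendor and device ID.  A line for the 88SE9485 (vendor ID 1b4b, device ID 9485) would look roughly like the sketch below; the d3d0 reset method is just one of the documented options (flr, d3d0, link, bridge, default) to try, not something I've confirmed works, and the host needs a reboot after editing:

    # Marvell 88SE9485 - try the d3d0 reset method (unverified)
    1b4b  9485  d3d0  false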

     

What I ended up doing was run the following over an SSH connection to ESXi:

    1. esxcli software vib install -d "/vmfs/volumes/datastore1/Patches/update-from-esxi6.0-6.0_update03.zip"
    2. esxcli software vib install -d "/vmfs/volumes/datastore1/Patches/ESXi600-201710001.zip"
    3. esxcli software vib update -d "/vmfs/volumes/datastore1/Patches/ESXi600-201808001.zip" --vibname cpu-microcode

Step 1 updates ESXi to the most recent ESXi update (Update 3).  The path is where I saved the zip files; your location may differ.

Step 2 installs the latest patch (201710001) that still works with my SAS/SATA cards.

Step 3 updates just the cpu-microcode VIB, which is what's needed to allow Unraid to boot in multi-processor mode.
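
    To sanity-check the result afterwards, you can list what actually got installed (both are standard esxcli commands):

    esxcli software vib list | grep -i microcode
    esxcli software profile get

    The first should show the cpu-microcode VIB at the 201808001 level while the rest of the host stays at the 201710001 patch level.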

     

Honestly, I don't know for sure that this setup won't cause problems down the line, but so far I have tested Unraid, Windows, OSX and Linux VMs in ESXi and everything seems to be working perfectly.

     





