UnRAID on Asus Pro WS W680-ACE IPMI



I can report that I have been successfully running UnRAID for weeks with the Asus Pro WS W680-ACE, a "Raptor Lake" CPU, and this Crucial memory (2 x 32GB DDR5 ECC sticks at 4800). I am using the four SATA connectors on the motherboard as well as the four additional ones available through a "SlimSAS" connector (I ordered this cable from Amazon, which seems to be of OK quality).

With the temperature plugin I have not managed to detect any temperatures other than the CPU, acpitz, and my nvme-pci-0400. As cache I use a Kingston DC1500M U.2 Enterprise 2.5" SSD (using this adapter card). I have not yet had time to test the IPMI card, but that is on my to-do list - I am not sure if it will provide any options for more advanced management of the hardware from UnRAID (the IPMI plug-ins I have seen seem to only target specific motherboard brands, and not ASUS)...
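(In case it helps anyone: as far as I can tell the temperature plugin uses the lm-sensors tools under the hood, so you can probe for more sensor chips from the UnRAID console. A rough sketch - the nct6775 module is just an example of what sensors-detect might suggest on this board:)

    # Scan for sensor chips (interactive; the default answers are safe)
    sensors-detect
    # Load whatever module it suggests, e.g. for a Nuvoton Super I/O chip:
    modprobe nct6775
    # List every temperature/fan/voltage reading now visible
    sensors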

 

Edited by NAS-newbie
Link to comment
14 hours ago, NAS-newbie said:

I can report that I have been successfully running UnRAID for weeks with the Asus Pro WS W680-ACE, a "Raptor Lake" CPU, and this Crucial memory (2 x 32GB DDR5 ECC sticks at 4800). I am using the four SATA connectors on the motherboard as well as the four additional ones available through a "SlimSAS" connector (I ordered this cable from Amazon, which seems to be of OK quality).

With the temperature plugin I have not managed to detect any temperatures other than the CPU, acpitz, and my nvme-pci-0400. As cache I use a Kingston DC1500M U.2 Enterprise 2.5" SSD (using this adapter card). I have not yet had time to test the IPMI card, but that is on my to-do list - I am not sure if it will provide any options for more advanced management of the hardware from UnRAID (the IPMI plug-ins I have seen seem to only target specific motherboard brands, and not ASUS)...

 

+1! 

 

I am also using the same board, with a 13500. 

I am also using the SlimSAS connector with a 4x SATA breakout cable.

I'm using Kingston memory, which reports single-bit ECC:

        Total Width: 80 bits
        Data Width: 64 bits
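(That output is from dmidecode, by the way - a rough sketch if anyone wants to run the same check from the UnRAID console; 80 bits total vs 64 bits of data is the extra ECC byte lanes:)

    # Per-DIMM widths: 80 bits total vs 64 bits data means ECC is present
    dmidecode -t memory | grep -E 'Total Width|Data Width'
    # The Physical Memory Array section reports the ECC type in use
    dmidecode -t memory | grep 'Error Correction Type'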

 

I use the IPMI card and it's very good. I get a lot of temperature monitoring and power info, and I use it to control the case fan speeds. I switched because the sensor driver tries to use nct6775 (if I remember correctly), which has a well-documented bug, but there is a new driver coming in the 6.2 kernel (whenever Limetech moves to that).
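(If you want to pull the same readings into a shell: a rough sketch, assuming ipmitool and the standard in-band Linux IPMI modules are available on your build:)

    # Expose the BMC to the OS (once per boot)
    modprobe ipmi_si
    modprobe ipmi_devintf
    # Dump every sensor reading along with its alert thresholds
    ipmitool sensor
    # Or just the fans:
    ipmitool sdr type Fan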

 

Very happy with the board, although the top row of USB ports (looking at it vertically) seems a bit picky about what it will detect; I had to plug my UPS into a lower bank of USB ports to get it detected.

 

Edited by klippertyk
Link to comment
  • 3 weeks later...

@NAS-newbie Would you happen to know if your RAM is single-bit or multi-bit ECC? The Unraid Dashboard should show it on the Memory panel -- at least it does on my current Unraid server with DDR3:

[screenshot: Unraid Dashboard Memory panel showing the ECC type]

 

I'm re-platforming/upgrading the server and have the same motherboard with a 13700K. I'm trying to pick the best RAM from what I believe are the only options in the world right now for ECC DDR5; I'm not sure which are single- or multi-bit.

  • Micron/Supermicro? This part number seems a little different to the one you shared
  • Kingston (SK Hynix)
  • SK Hynix

 

It'd also be awesome if you could fix the link to your SlimSAS cable in the OP 😉

 

Thanks in advance!

 

Edited by Omid
Link to comment

I can also say that so far I really like the build quality, the features of the motherboard, the BIOS, etc. It is not an inexpensive board, but in this case it seems to deliver what one pays for.

And as you say, at least in Sweden where I live, it seems to be the only motherboard you can ACTUALLY BUY if you want a "Raptor Lake" CPU with ECC memory, so there is not much of a choice.

  • Upvote 1
Link to comment

Thanks for confirming and for the link! Now to try to find the cable in the UK... 😅

 

And I totally agree with you regarding the motherboard. The build quality and finish isn't too far off ASUS's ROG series of motherboards, so I feel like it's using the same premium components. Its features, ports, and PCIe configurations seem very appropriate for an Unraid build (e.g. no bifurcation when maxing out the M.2 slots or the SlimSAS connector). My only qualm is that the big cooler I got for the 13700K, the Noctua NH-D15, will be almost touching the IPMI card if I put it in the top PCIe x1 slot (1mm gap; maybe less!). I'm pretty sure they will eventually touch after some time, because... gravity.

 

I can confirm that I received 2 x 32GB Kingston modules (KSM48E40BD8KM-32HM) and am currently putting them through memtest, where they were detected as supporting ECC.

Link to comment

So, I've finally hit some challenges with this motherboard. Well, more specifically, with its IPMI card, which doesn't seem to be documented very well. For starters, the full manual (only available online) says that the box should contain an SPI cable and a TPM adapter, neither of which I appear to have. This is also inconsistent with the list of accessories the Quick Start guide mentions, which doesn't include those two items. But never mind, it doesn't seem like those are mandatory.

 

It would be great if anyone who purchased this motherboard (the IPMI variant) could confirm what cables they received in the box.

 

The BIOS detects and shows some details from the IPMI card, so it appears to be functioning fine. My issue now is that I can open the web interface, but it's not accepting the default credentials (admin:admin). I can't find a way to change/reset the password via the BIOS or via something like a jumper. Some results online for older versions suggest flashing the card or running some commands to do it, but I'd need to boot into something, and I'd rather do that before moving Unraid into this build.
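(The kind of commands I mean - a rough sketch from a Linux live USB, assuming the BMC exposes a standard in-band IPMI interface. Untested on this card, and the user ID is a guess, so list the users first:)

    # Expose the BMC to the booted OS
    modprobe ipmi_si
    modprobe ipmi_devintf
    # List users on LAN channel 1 to find the admin account's ID
    ipmitool user list 1
    # Set a new password for that user (ID 2 here is only an example)
    ipmitool user set password 2 'MyNewPassword'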

 

I'm wondering if anyone can confirm what their default credentials were. I've already tried many variations with CAPS, the full administrator word, etc.

[screenshot: the IPMI web interface login page]

Thanks!

Link to comment

-------- Edit: I'm a fool, but I'll leave this post up for comedy value. FYI - the cable clips in firmly when it's inserted the right way (tab facing out) 😄🤦‍♂️

And set SlimSAS to SATA under 

Advanced > Onboard Devices Configuration > SlimSAS Configuration

-------- End of Edit

 

Hey @NAS-newbie, did you have to do anything for the SAS cable to work? I went with this one in the end, which matches the specs of the one you shared -- thanks for your help!

 

However, none of the drives connected to it are detected by Unraid or the BIOS. I don't really hear or feel a click when plugging the cable into the motherboard, so I'm not sure if I'm simply not using enough force. The push-tab faces the back side of the motherboard, right? So you can't see it. I'm not familiar with these SAS cables/interfaces, but I'll now try using more force.

 

Just wanted to check if I've missed something or whether it really is just meant to be plug 'n' play?

 

I saw a setting in the BIOS for whether SlimSAS should be PCIe (default) or SATA. I changed it to SATA, but that didn't make a difference.

Edited by Omid
Link to comment

No worries, but I really appreciate the response. It's great to see this community is still as alive as ever!

Issue resolved. I thought my edit would beat your reply but you were too damn quick!

 

Thanks again! I'm sure this thread will help someone else in the future *high five*

Edited by Omid
Link to comment

Happy to hear you got it working! I actually did some performance measurements comparing the four fixed SATA ports with the four available over the SlimSAS connector. As expected there was no difference in performance level (same protocol and nominal speed), but for some reason the variation over time in both bandwidth and latency was a little lower on the SlimSAS ports (i.e. less jitter) - perhaps there is another controller in the chipset, or some other difference between them?!
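(If anyone wants to reproduce the comparison: a rough sketch with fio, run read-only against a whole disk on each controller path; /dev/sdX is a placeholder for your own device:)

    # 60s sequential read with direct I/O; repeat on an onboard-SATA disk
    # and a SlimSAS-attached disk, then compare bw stdev / clat percentiles
    fio --name=seqread --filename=/dev/sdX --readonly --rw=read \
        --bs=1M --direct=1 --runtime=60 --time_based --group_reporting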

Anyhow, I am using these slightly "better" SlimSAS/SATA ports for my parity and largest/most-used drives - this is partly a hobby project for me, so making every little improvement I can think of is part of the fun 🙂

  • Thanks 1
Link to comment

Yeah, I wouldn't expect a noticeable difference between them since they both connect via the Chipset. In terms of storage on this motherboard, I think only M.2_1 is direct to the CPU (and that would be an unfair comparison 😝).

 

Interesting point about the jitter you notice. I'll have to keep an eye on that over time. I wonder if that could be due to the cables used(?). I don't know about you, but I use whatever SATA cables I have laying around, which most likely came bundled with another purchase. The SAS cable seems a bit more premium (e.g. less noise?) 🤷‍♂️

 

Nice logic for choosing which drives to connect via the SAS cable 👍 Also just a hobby with a home server over here.

Link to comment

I'm still unable to log in to the IPMI console 🙁 The default credentials simply don't work. It's as if someone has already used it and set their own password.

 

I need to work out how to flash/reset the card but there's absolutely no information about it online...

Link to comment
1 hour ago, Omid said:

Yeah, I wouldn't expect a noticeable difference between them since they both connect via the Chipset. In terms of storage on this motherboard, I think only M.2_1 is direct to the CPU (and that would be an unfair comparison 😝).

 

Interesting point about the jitter you notice. I'll have to keep an eye on that over time. I wonder if that could be due to the cables used(?). I don't know about you, but I use whatever SATA cables I have laying around, which most likely came bundled with another purchase. The SAS cable seems a bit more premium (e.g. less noise?) 🤷‍♂️

 

Nice logic for choosing which drives to connect via the SAS cable 👍 Also just a hobby with a home server over here.

Yes, it could be the SATA cable I used... Just like you say, I rarely keep track of new vs. old; I just have a box with them and other disk accessories... The SlimSAS cable definitely looked more premium and, more importantly, was new for sure - not bent and folded many times, as old cables may have endured over time...

Link to comment
  • 1 month later...
  • 2 weeks later...
  • 2 weeks later...

I just purchased the MicroATX version of this board: the Asus Pro WS W680M-ACE SE. I bought it directly from Asus since stores don't have it in stock yet (it was only launched a few weeks ago), and it arrived within a few days. It's very similar to the full-size ATX version, but it only has two M.2 slots and three PCIe slots (one 5.0 x16, one 4.0 x4, and one 3.0 x1). It also has the IPMI/BMC directly onboard rather than on a separate card.

 

Haven't tried it with Unraid yet - will likely set it up over the weekend.

 

On 6/17/2023 at 3:48 PM, Omid said:

Would you happen to know if your RAM is single-bit or multi-bit ECC?

As far as I know, the only RAM that supports multi-bit ECC at the moment is buffered memory, whereas this motherboard only takes unbuffered. "Proper" expensive enterprise servers almost always use buffered RAM, so unbuffered ECC RAM has a much smaller market.

 

I bought 2 x 32GB Kingston KSM48E40BD8KM-32HM DDR5 ECC since it's one of the only models that's listed in the compatibility list for this motherboard, and I could get them for $100 each with an employee discount at one of the suppliers we use at work. It's only single-bit ECC though.

Edited by Daniel15
Link to comment

I'm starting to appreciate the on-board IPMI on my previous Supermicro motherboard. You've got yourself a nice board if you're happy with the two M.2 slots. Three PCIe slots is still plenty for an Unraid build IMO.

 

Thanks for the added context on ECC options. My Kingston KSM48E40BD8KM-32HM modules have worked out great so far!

 

Good luck with the build!

Link to comment
On 8/7/2023 at 2:18 AM, mikeyosm said:

Can anyone in this thread confirm if the IPMI card is picked up by the UNRAID IPMI plugin and shows the stats on the UNRAID dashboard?

 

Seems like it works:

[screenshot: IPMI sensor data on the Unraid dashboard]

 

I had to adjust the alerts in the IPMI web interface because I'm using a Noctua NH-D15 CPU cooler and the fans spin so slowly at the lowest setting (360 RPM) that it thought the fans were broken, so I kept seeing alerts about low fan speed 😂. Out of the box, IPMI systems tend to expect small, high-RPM server fans.
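(I did it through the web UI, but for reference the same thresholds can also be lowered with ipmitool - a rough sketch, where the sensor name and RPM values are placeholders you'd take from your own sensor listing:)

    # Note the exact sensor name and its current lower thresholds
    ipmitool sensor
    # Lower the non-recoverable / critical / non-critical lower thresholds
    # ("CPU_FAN1" and the RPM values are examples, not this board's names)
    ipmitool sensor thresh "CPU_FAN1" lower 0 100 200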

 

Unfortunately the PSU I'm using doesn't support SMBus/PMBus, so I can't connect it to the motherboard to get data from it. I don't think there are any modern non-server PSUs that support it - Corsair were the last company to support it, via their "Corsair Link" feature, but they dropped it a while back. Also, most of my case fans are connected to the case's fan controller, but I wonder if I should connect them to the motherboard instead. Hmm.

 

4 hours ago, Omid said:

Three PCIe slots is still plenty for an Unraid build IMO.

 

I've only got three drives at the moment, and the motherboard supports 8 SATA devices (4 onboard SATA ports plus a SlimSAS port that can connect another 4 SATA devices), so I'm set for drives. I just need one PCIe slot for a 10Gbps network card and one for a Google Coral TPU (which I use with Blue Iris in a VM for AI-based object detection for my security cameras).

Edited by Daniel15
Link to comment
On 8/30/2023 at 7:08 AM, Daniel15 said:

 

Seems like it works:

[screenshot: IPMI sensor data on the Unraid dashboard]

 

I had to adjust the alerts in the IPMI web interface because I'm using a Noctua NH-D15 CPU cooler and the fans spin so slowly at the lowest setting (360 RPM) that it thought the fans were broken, so I kept seeing alerts about low fan speed 😂. Out of the box, IPMI systems tend to expect small, high-RPM server fans.

 

Unfortunately the PSU I'm using doesn't support SMBus/PMBus, so I can't connect it to the motherboard to get data from it. I don't think there are any modern non-server PSUs that support it - Corsair were the last company to support it, via their "Corsair Link" feature, but they dropped it a while back. Also, most of my case fans are connected to the case's fan controller, but I wonder if I should connect them to the motherboard instead. Hmm.

 

 

I've only got three drives at the moment, and the motherboard supports 8 SATA devices (4 onboard SATA ports plus a SlimSAS port that can connect another 4 SATA devices), so I'm set for drives. I just need one PCIe slot for a 10Gbps network card and one for a Google Coral TPU (which I use with Blue Iris in a VM for AI-based object detection for my security cameras).

Thanks for confirming. Is this the ATX Ace with the IPMI card, or the mATX Ace with built-in IPMI?

Edited by mikeyosm
Link to comment
2 hours ago, mikeyosm said:

Thanks for confirming. Is this the ATX Ace with the IPMI card, or the mATX Ace with built-in IPMI?

I've got the mATX with built-in IPMI. Both use the same system chip and software (ASPEED AST2600A3-GP, AMI MegaRAC SP-X, ASMB12-iKVM) so they should function identically.

Edited by Daniel15
Link to comment
23 hours ago, Daniel15 said:

I've got the mATX with built-in IPMI. Both use the same system chip and software (ASPEED AST2600A3-GP, AMI MegaRAC SP-X, ASMB12-iKVM) so they should function identically.

Nice. Where did you get the board? Shows sold out everywhere.

I also noticed there's no DIMM temperature shown in the IPMI; which memory sticks and CPU did you end up getting?

Link to comment
26 minutes ago, mikeyosm said:

Nice. Where did you get the board? Shows sold out everywhere.

I also noticed there's no DIMM temperature shown in the IPMI; which memory sticks and CPU did you end up getting?

 

I bought it directly from the Asus website. I'm in the USA but I'm not sure which other regions their store operates in. Newegg has them in stock too.

 

For the RAM, I'm using two sticks of Kingston server DDR5 ECC RAM, model number KSM48E40BD8KM-32HM. I got them for $100 each using an employee discount with a supplier we use at work. It's also the only ECC RAM on Asus's compatibility list that seems to be easily obtainable - I couldn't figure out where to buy the others.

 

I don't see the RAM temperature anywhere, but I did see it when running memtest86, so there's probably just some sensor config I need to do? I'm not sure, as sensors-detect didn't detect it...

 

For the CPU, I'm using a Core i5-13500. I usually stick to the ...500 CPUs since they're often the best balance of price and performance. My previous PC had an i5-6500 and the small form factor PC I'm replacing with this new server has an i5-9500. :)

 

If you want to do anything with the integrated graphics (a Plex or Jellyfin server, security cameras, etc.), the iGPU in the 13500 (UHD 770) is more powerful than the one in the 13400 and below (UHD 730) and supports more concurrent video encoding/decoding jobs. Both support SR-IOV, which lets you use the one GPU in multiple VMs and Docker containers at the same time (you'll need the SR-IOV plugin).

Edited by Daniel15
  • Like 1
Link to comment
