
Problem adding 2 identical SSDs to Unraid


MNM87
Solved by JorgeB


Hello, I just bought 2 SSDs (MKNSSDTS2TB-D8),

but I can only see one in Unraid. I can see both in the BIOS.

Before, I had a Kingston in one of the M.2 slots, and if I use the Kingston and one of the new drives it works fine.

I also tried the other new SSD with the Kingston and then it works fine too. It's just when I add both new ones that it only shows one drive.

I think it's maybe because Unraid can't tell the difference between the 2 drives.

 

Any idea how to make it work? 🙂

9 minutes ago, MNM87 said:

Hello, I just bought 2 SSDs (MKNSSDTS2TB-D8),

but I can only see one in Unraid. I can see both in the BIOS.

Before, I had a Kingston in one of the M.2 slots, and if I use the Kingston and one of the new drives it works fine.

I also tried the other new SSD with the Kingston and then it works fine too. It's just when I add both new ones that it only shows one drive.

I think it's maybe because Unraid can't tell the difference between the 2 drives.

 

Any idea how to make it work? 🙂

You are likely to get better-informed feedback if you attach your system's diagnostics zip file to your next post in this thread.

 

There can be a problem if the drives do not report different serial information.
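If nvme-cli is available, one rough way to check whether the two drives really report distinct identity strings is to compare the controller identify data side by side (the `/dev/nvme0` and `/dev/nvme1` device names below are assumptions; adjust them to your system):

```shell
# Compare the identity fields of both NVMe controllers. Unraid keys devices
# off these strings, so if the serial (sn) and subnqn are identical on both
# drives, the second one can be rejected as a duplicate.
nvme id-ctrl /dev/nvme0 | grep -E '^(sn|mn|fr|subnqn)'
nvme id-ctrl /dev/nvme1 | grep -E '^(sn|mn|fr|subnqn)'
```

If both commands print the same `sn` and `subnqn` values, the firmware is not providing the unique identifiers the kernel and Unraid rely on.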


I see entries like:


Feb 10 20:08:43 NAS kernel: nvme nvme0: missing or invalid SUBNQN field.
Feb 10 20:08:43 NAS kernel: nvme nvme1: missing or invalid SUBNQN field.

And

Feb 10 20:08:43 NAS kernel: nvme nvme1: globally duplicate IDs for nsid 1
Feb 10 20:08:43 NAS kernel: nvme nvme1: VID:DID 1dbe:5216 model:MKNSSDTS2TB-D8 firmware:2.0.0.14

In the syslog.
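The "globally duplicate IDs" message means the kernel compared the namespace unique identifiers (EUI-64 / NGUID) of both drives and found them identical, so it refused to attach the second namespace. A quick sketch to inspect those fields with nvme-cli (device names are assumptions; adjust to your system):

```shell
# Show the unique IDs the kernel compares across namespaces. On drives with
# buggy firmware these are often identical (or all zeros) on every unit,
# which triggers the "globally duplicate IDs for nsid 1" rejection.
nvme id-ns /dev/nvme0 -n 1 | grep -E 'nguid|eui64'
nvme id-ns /dev/nvme1 -n 1 | grep -E 'nguid|eui64'
```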

 

Maybe UD (Unassigned Devices) uses a different way of identifying the disks than Unraid itself does.

  • 11 months later...

I have this same problem with two Samsung 970 EVO Plus 2TB NVMe SSDs installed on a dual-NVMe PCIe adapter (RIITOP M.2 NVMe SSD to PCIe 3.1 x8/x16 card) in an HP ProLiant MicroServer Gen10+ server running Unraid 6.12.6.

 

Both NVMe devices are detected by the BIOS, and both appear in the list of bootable devices.

Note that I see the same issue when I boot a more or less standard Linux (Linux Mint 21.3) from USB, and also when I boot a fairly recent version of PartedMagic.

But, when I put the card and drives into an older HP Z210 workstation, both drives are visible under both Windows 10 and an older version of Linux Mint.

I guess that I am surprised that Samsung drives would see this issue.

rimviewserver-diagnostics-20240202-1340.zip

13 hours ago, Dwight said:

I have this same problem with two Samsung 970 EVO Plus 2TB NVMe SSDs installed on a dual-NVMe PCIe adapter (RIITOP M.2 NVMe SSD to PCIe 3.1 x8/x16 card) in an HP ProLiant MicroServer Gen10+ server running Unraid 6.12.6.

 

Only one device is being detected by Linux, so this is not a software problem. Try swapping them around and see if the other one gets detected (check by the serial number). It could be an adapter problem; since it doesn't need PCIe bifurcation, it should not be a board issue.
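To tell which physical drive is the one the kernel actually enumerated after swapping, you can read the serial straight from sysfs, with no extra tools needed (a minimal sketch, assuming a standard Linux sysfs layout):

```shell
# Print the serial and model of every NVMe controller the kernel sees.
# Compare the serial against the sticker on each drive to identify which
# physical unit was detected.
for d in /sys/class/nvme/nvme*; do
    echo "$d: $(cat "$d/serial") $(cat "$d/model")"
done
```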

 

 


Did that previously. It seemed to be tied to the slot the NVMe drives were inserted in. I am loath to swap them around, since the one that is visible now has pool data on it. I might plug it into my other workstation, where both drives are visible, and clone the one with data over to the other one.

 

I thought it was similar to this problem because, per syslog.txt, it is throwing the "missing or invalid SUBNQN field" error.

 

Feb  2 13:30:25 RimviewServer kernel: nvme nvme0: pci function 0000:0a:00.0
Feb  2 13:30:25 RimviewServer kernel: scsi host1: ahci
Feb  2 13:30:25 RimviewServer kernel: scsi host2: ahci
Feb  2 13:30:25 RimviewServer kernel: nvme nvme0: missing or invalid SUBNQN field.
Feb  2 13:30:25 RimviewServer kernel: nvme nvme0: Shutdown timeout set to 8 seconds

