Robb3rt

Members
  • Posts: 10
  • Gender: Undisclosed


  1. So I replaced the Samsung SSD with a new Samsung SSD. While converting from the old disk to the new one, I saw a few btrfs filesystem errors being corrected, and the SMART values of the old SSD were not great either. I have been using the new SSD for a few days now and it performs a lot better; for the first time in a long while I am seeing read/write speeds above 200 MB/s. So in my case I think it was just a bad SSD.
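     A minimal way to re-check disk health after a swap like this (a sketch; /dev/sdb and /mnt/cache are placeholders for your own old device and pool mount point):

       smartctl -a /dev/sdb           # print SMART attributes; watch Reallocated_Sector_Ct, Wear_Leveling_Count and the error log
       btrfs scrub start /mnt/cache   # re-verify all checksums on the new pool
       btrfs scrub status /mnt/cache  # report progress and any uncorrectable errors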
  2. Yes, I have the Dynamix SSD TRIM plugin installed. Thanks for the tip!
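     The plugin just runs a scheduled trim; you can also fire one off by hand to confirm it works (sketch, assuming the pool is mounted at /mnt/cache):

       fstrim -v /mnt/cache   # -v prints how many bytes were trimmed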
  3. So after a few days I was able to remove those 2 Intel SSDs. First convert the RAID1 to a single-disk config:

       btrfs balance start -f -dconvert=single -mconvert=single /mnt/cache

     Then I was able to remove the first SSD; after the balance was complete I removed the last SSD:

       btrfs device remove /dev/intel-ssd /mnt/cache

     The performance seems better, but not at the level I had hoped for. Those 3 SSDs are 5-6 years old now, so just to be sure I ordered a new SSD.
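     If anyone follows the same steps: before pulling a device it is worth confirming the balance has finished and the pool really is on the single profile (sketch):

       btrfs balance status /mnt/cache   # should report that no balance is running
       btrfs filesystem df /mnt/cache    # Data and Metadata lines should now show 'single'
       btrfs filesystem show /mnt/cache  # lists which devices are still in the pool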
  4. I fixed the issue by running btrfs-select-super -s 1 (on both Intel SSDs). All credit goes to the almighty god Multicore on #btrfs.
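     For completeness, btrfs-select-super takes the device as an argument, so it is run once per disk, roughly like this (sketch; /dev/sdc1 and /dev/sdd1 are placeholders for the two Intel SSD partitions):

       btrfs-select-super -s 1 /dev/sdc1   # overwrite the damaged primary superblock with backup copy 1
       btrfs-select-super -s 1 /dev/sdd1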
  5. Some updates from my side: I've been searching online and found something. On some forums people mentioned that a device holds more than one copy of the superblock, and with the command btrfs inspect-internal dump-super -s 1 /dev/sdx1 I did find a backup superblock. But I don't really know how to restore it the correct way.
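     For context (my understanding, so double-check): btrfs keeps up to three copies of the superblock per device, at fixed offsets of 64 KiB (primary), 64 MiB and 256 GiB, and dump-super can inspect each one:

       btrfs inspect-internal dump-super -s 0 /dev/sdx1   # primary superblock
       btrfs inspect-internal dump-super -s 1 /dev/sdx1   # first backup, at 64 MiB
       btrfs inspect-internal dump-super -s 2 /dev/sdx1   # second backup, at 256 GiB (only present on large devices)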
  6. Is the superblock something like a partition? In the past I've had some experience with partition recovery, so can I just try to restore the btrfs superblock? On my Samsung SSD there is still a btrfs partition, but on the other 2 devices the partition type shows as Linux/unknown.
  7. Hi, so I did something stupid... I've been using Unraid for several years now and never had any issues until now. I started with a single cache disk, and at the beginning of this year I added 2 cache disks for safety.

     The last few weeks I've experienced poor performance from the cache pool, with read/write speeds no higher than 30 MB/s. I realized the problem started when I added those 2 extra SSDs, so I thought I would just remove the 2 SSDs and see what that does for performance. I stopped my array, unassigned those 2 extra SSDs and started the array again, but Unraid gave the error that the btrfs filesystem is "Unmountable: No file system". So I quickly stopped the array again, put back the original config with those 3 SSDs and tried to start the array again... but yes, I had already fucked up the btrfs filesystem, because it is still Unmountable. Then I started searching online for how to recover my btrfs cache pool, and quickly realized I had not done enough reading about btrfs and its procedures.

     What I've tried so far: starting the array in maintenance mode and running a check/repair of the filesystem, but somehow the btrfs RAID is missing those 2 extra devices ("warning, device 2 is missing"). I am able to mount the filesystem in degraded,ro mode, but the data seems to be corrupt when I try to access or copy it. When I try a restore it partially restores files, but on about 50% of the data I get an error: "Trying another mirror ERROR: exhausted mirrors trying to read (3 > 2)". In my opinion this should theoretically be fixable, because the disks themselves are fine and I did nothing to them, but I seem to be stuck here. Do you guys have any experience or tips for me?

     My cache setup:
       Disk 1: Samsung 840 EVO 250GB
       Disk 2: Intel SSD SA2CW160 160GB
       Disk 3: Intel SSD SA2CW160 160GB

     If you need more info, debugging output or logs, let me know!
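     For reference, the commands behind those attempts were roughly these (sketch; /dev/sdb1 stands for the Samsung cache device and /mnt/recovery for an empty directory on the array):

       btrfs check /dev/sdb1                       # read-only check; this is where the 'device 2 is missing' warning appears
       mount -o degraded,ro /dev/sdb1 /mnt/cache   # mount the surviving member read-only
       btrfs restore -v /dev/sdb1 /mnt/recovery    # copy out whatever data is still readable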
  8. +1. Is there a way to add this as an option that can be turned on or off?
  9. I already fixed it by replacing the motherboard with an Asus Sabertooth 990FX. After that I replaced my hardware with Intel for better performance with Unraid.
  10. Hi, I'm new to this forum, so sorry for my bad English! I have an Unraid system with the following specs:

       Motherboard: Gigabyte GA-990FXA-UD3
       CPU: AMD FX-8350
       RAM: 4x 4GB DDR3 1600 MHz
       GPU 1: NVIDIA EN8400GS
       GPU 2: Asus R9 280X
       Cache: 2x Samsung 840 EVO 250GB, 1x Samsung 850 EVO 250GB
       HDD: 2x WD Red 3TB

      When I try to install a Windows 7 VM I run into the following issue: after the GPU driver loads, Windows won't boot (BSOD 0x7E). When I boot into safe mode and remove the GPU driver, Windows boots normally. I'm stuck at this point... can anyone help me? Cheers!
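      One common first step worth ruling out (a sketch, not a guaranteed fix): make sure the GPU you pass through is bound to vfio-pci so the host never loads a driver for it. Find the card's vendor:device ID and add it to the append line in /boot/syslinux/syslinux.cfg (the ID below is only an example; use whatever lspci prints for your R9 280X):

        lspci -nn | grep -i vga
        # e.g. 01:00.0 VGA compatible controller [0300]: ... [1002:6798]

        # then in /boot/syslinux/syslinux.cfg:
        append vfio-pci.ids=1002:6798 initrd=/bzroot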