Dephcon

Members · 601 posts · Days Won: 1

Everything posted by Dephcon

  1. When I clicked on the Dockers tab in the webui, the unifi logo was loading like it's 1999. No biggie, it just seems like 128x128 or 256x256 is more than enough.
  2. Is it possible to reduce the size of the image you're using for the container icon? It's massive.
  3. Not sure if this has been mentioned, but for the stats plugin the network graph says eth0 even though I'm using a bond.
  4. I've had a similar idea: basically an app drive similar to cache, with the ability to do multi-disk btrfs. This would allow unraid to have array, cache, and app tiers of disks. I'm starting to notice that Plex transcoding and copying from Sonarr are competing for cache SSD resources, and it's causing issues for my dad, who just started using my Plex server over VPN. It would be really great to have a separate tier as stated above; different disk specs can be used depending on the use case. The mid-tier SSD I have for cache is great for that purpose, but I'd probably get an enterprise Intel SSD for app purposes.
  5. I have a SanDisk Z400; if I could go back in time I'd splurge on an 850 Pro or S3610. Sent from my 6055Y using Tapatalk
  6. I also had these issues with my HBAs as pass-through devices, which I noted in the b18/b19 release threads. I ended up abandoning ESXi and am running unraid on bare metal now, which I'm glad I did, as the performance of my dockers has gone up quite a bit. There was a lot of CPU-wait overhead from unraid being a 6-core VM, especially once I started using Plex as a docker on unraid.
  7. I'm gonna have to try and syphon some money out of savings to get a UAP now. I had some for a couple of weeks for testing prior to installing at my in-laws'; what a neat platform. Might also have to start saving for their new 16-port PoE switch too, lol.
  8. I'll need to do so when my VCP comes up for renewal, now that I'm running unraid on bare metal.
  9. Do you still need to be in the UK to access iPlayer? I want me some Essential Mix.
  10. http://www.shoprbc.com/ca/shop/product_details.php?pid=14450928
  11. Screenshot of said VM Manager? Also, you will miss ESXi if you ever need to configure VLANs. As far as creating and editing VMs/properties goes, it's good, and the VNC integration is nice as vSphere is becoming less and less functional. Monitoring is a whole other can of worms, which is not currently addressed, and the VLAN thing is a letdown too. HOWEVER, none of that really applies to me for home use, and I couldn't care less. The real bummer is that the FIO cards I was using for vmdk/virtual cache disks don't work on Slackware, so I had to replace them with a standard SSD for my cache/domain storage. With that said, I like having all 8 cores accessible to unraid and my docker containers without all the io-wait associated with a large-core-count VM.
  12. Switched back to bare metal; everything seems to be working great now. Also just wanted to say that the VM manager is pretty darn good, I don't miss ESXi much.
  13. http://www.ncix.com/detail/sandisk-x400-512gb-2-5-sata-df-125375-1101.htm It has similar performance to the 850 EVO (slightly lower random-write IOPS but a slightly higher endurance rating) for $40-50 cheaper. I'd been waiting for a deal on an EVO when this popped up, so I took a chance; it's a brand-new drive from SanDisk. Not sure if 160 TBW over 5 years is enough for my cache/docker duties, but the price is right, and I can look at an Intel 36x0-series SSD if it doesn't work out in a couple of years. Should be getting mine today.
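     To put that endurance rating in perspective, a quick back-of-the-envelope calculation (using only the 160 TBW and 5-year figures from the post above) shows the average daily write budget:

     ```python
     # Rough sanity check: average writes per day before exhausting a
     # 160 TBW endurance rating over a 5-year warranty period.
     TBW = 160              # total terabytes written, per the spec sheet
     YEARS = 5              # warranty period
     DAYS = YEARS * 365

     gb_per_day = TBW * 1000 / DAYS   # decimal TB -> GB
     print(f"{gb_per_day:.1f} GB/day")  # -> 87.7 GB/day
     ```

     Roughly 88 GB of writes per day, every day, for five years. Whether that is enough for combined cache and Docker appdata duty depends entirely on the workload, which is presumably why the poster is unsure.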
  14. I'm also having issues with this 4.6.0 update. Sometimes I lose connectivity, so I restart the container and it reverts back to the previous version, then it auto-updates again and I lose VNC. This has just been happening recently; I had no issues before.
  15. +1 for LSI, they're simply the best you can get. Everyone prefers them and everyone supports them for pretty much everything. I've been battling with Dell/HP for years to offer stock LSI cards for our purchases; they're just now starting to offer them as they realize no one wants to deal with their shit firmware. Seems the growing popularity of open-source storage has finally got the message across.
  16. Keep in mind that the SASLP (SAS1) doesn't support 3TB+ drives; you pretty much need to buy SAS2 cards these days.
  17. I really hate manufacturers that aren't upfront about their TBW/DWPD ratings.
  18. I've got 2 of those in my server and no issues here. Super weird. I reverted back to beta18, same deal, no disks. I reverted further back to 6.1.9 and my cards were detected, then upgraded to beta19 with no issues...
  19. All my disks are missing now (all on an M1015 in IT mode). vault13-diagnostics-20160318-1743.zip