Zonediver

Everything posted by Zonediver

  1. Ah - that works - thanks a lot 🤣👍
  2. ...I am confused... Is this normal? Each stream appears twice?
  3. New release... It tells me it can't be installed... minimum version 6.11.9 (???) Confused... 🤪
  4. ...and this build is supposed to run 24/7? Pfffff... your electric bill will be glorious 🤣 For comparison: my system runs 12-15 hours per day and consumes ~34 kWh per month...
  5. Maybe a cable issue... I reach 1169 MB/s max between Win 10 and unraid.
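     A quick way to separate network problems from disk/SMB overhead is a raw iperf3 run between the two machines - a sketch, assuming iperf3 is available on both ends (the IP is only an example):

         # on the unraid box: start an iperf3 server
         iperf3 -s
         # on the Windows box: run a 4-stream test against it
         iperf3 -c 10.10.10.250 -P 4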
  6. Well... is that necessary? For forum access? That seems to me like using a sledgehammer to crack a nut... I am not amused, but ok...
  7. 1) 250MB over USB 3.0 is already very fast anyway. 2) The link doesn't work.
  8. command not found - Can this be installed afterwards? And if so, how?
  9. Since 2010, I have used ASRock for unraid and all my other PCs - rock solid and (mostly) not expensive 👍 My router (IPFire) has been running on an ASRock Q1900M since 2017 (24/7, 2075 days, 49800 hours) and is still going.
  10. ? If you don't like the answer, don't ask 👍
  11. Because a workstation is stronger and more flexible than a VM - at least for "some" workloads... If you use one graphics card for two users >> less power for both.
  12. ...I am not sure if unraid and VMs are the right choice for this... This is more of a "workstation case"
  13. I am not sure if ECC is necessary for Plex only - maybe for a machine with VMs... I have used unraid since 2010 for Plex only and have never used ECC.
  14. Transcode to what? The iGPU of my i7-9700 can transcode four 4K streams to 1080p/8MBit without a problem - the iGPU laughs at those four streams. So the question is: what is the target resolution and bitrate...
  15. Thanks for your help - working 👍
  16. After the update to v2023.02.05a, I see this: ...not normal (I think).
      Feb 6 11:18:01 Horus crond[1123]: exit status 127 from user root /usr/local/emhttp/plugins/dynamix.system.stats/scripts/sa1 1 1 &> /dev/null
      Feb 6 11:19:01 Horus crond[1123]: exit status 127 from user root /usr/local/emhttp/plugins/dynamix.system.stats/scripts/sa1 1 1 &> /dev/null
      (the same line repeats every minute through 11:37:01)
      The Stats-Plugin was working before this update...
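      For reference, exit status 127 usually means "command not found". A minimal diagnostic sketch - the script path is taken from the log above; the rest is a generic check, not a documented plugin procedure:

          # run the failing cron job by hand, with tracing, to see which call breaks
          bash -x /usr/local/emhttp/plugins/dynamix.system.stats/scripts/sa1 1 1
          echo "exit code: $?"   # 127 here confirms a missing binary
          # then inspect the script to see what it tries to execute
          cat /usr/local/emhttp/plugins/dynamix.system.stats/scripts/sa1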
  17. ...and when will it come? (soon 🤣)
  18. Necessary, no - but it's the "most efficient way" to transcode for Plex - so yes, it's recommended 👍
  19. Found a solution which works (Page #12):
  20. That's the point... I have only one - under my cache SSD - but it is empty. (\\10.10.10.250\cache\appdata\plex\Library\Application Support\Plex Media Server\Cache\Transcode\Sessions) When I look under /tmp, there is nothing... so I am a little bit confused 🤣 I also get these fancy mails:
      unRAID Alert [HORUS] - Docker image disk utilization of 100%
      unRAID Notice [HORUS] - Docker image disk utilization returned to normal level
      In my case the gap between both mails is ~1 min. This happens 3-4x per year...
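      For context, the usual way to get Plex transcoding into RAM on unraid is to map a container path to /tmp on the host (which is RAM-backed on unraid) and point Plex's transcoder temp directory at it. A minimal sketch of that idea - the image name is the official one, but you would add your usual config/media volumes and ports from your template:

          # map RAM-backed /tmp into the container as /transcode
          docker run -d --name plex \
            --net=host \
            -v /tmp:/transcode \
            plexinc/pms-docker
          # then in Plex: Settings > Transcoder >
          # "Transcoder temporary directory" = /transcode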
  21. How can I check if Plex is transcoding to RAM? Is there a way to tell? This link is interesting... https://github.com/binhex/documentation/blob/master/docker/faq/plex.md BUT: I have no variable named 'TRANS_DIR' - so what is this???
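      (TRANS_DIR appears to be specific to the binhex images; other Plex containers use a plain path mapping instead.) A quick check, assuming the /tmp mapping sketched above - start a stream that forces a transcode, then on the unraid host:

          du -sh /tmp                       # should grow while the transcode runs
          find /tmp -iname '*transcode*'    # Plex creates session folders in its temp dir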
  22. I use three PCIe cards:
      • A PCIe 3.0 x8 HBA in the primary PCIe 3.0 x16 slot (CPU)
      • An Intel GBit NIC in the PCIe 3.0 x1 slot (PCH)
      • A PCIe 3.0 x4 10GBit NIC from Asus in the PCIe 3.0 x4 slot (PCH)
      There are no bottlenecks so far... Both NICs together use 5 PCIe lanes of the PCH and run at full speed - the 10GBit NIC reaches 9480 MBit max.
  23. Well... maybe I am too silly, but there is no "orange area"... nowhere... sorry EDIT: Now I got it 🤣 These are the values of my Samsung 970 Pro SSD: