
About Fredrick



  1. Hey, I've got this Docker setup up and running with Organizr as the frontend, and it has been working great! Now I'm developing a .php page that I want to test while coding it. Is there an easy way to use this Docker setup to serve the .php file without disturbing the rest of my setup? For now I'd just like to have it served locally.
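One low-risk approach (a sketch only; the directory, port, and container name below are placeholders, and it assumes the official php:apache image from Docker Hub) is a throwaway container that mounts the dev directory, so the existing stack is never touched:

```shell
#!/bin/sh
# Sketch: serve a .php page under development from a disposable container.
# APP_DIR and PORT are placeholders; remove the leading "echo" to actually run it.
APP_DIR="$PWD/myapp"   # directory containing the .php page (hypothetical name)
PORT=8088              # any free host port, so Organizr on 80/443 stays untouched

echo docker run -d --name php-dev \
  -p "${PORT}:80" \
  -v "${APP_DIR}:/var/www/html" \
  php:8.2-apache
```

The page would then be reachable at http://localhost:8088/yourfile.php, and `docker rm -f php-dev` removes the container when you're done.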
  2. This is how I found out: a push notification from the "Fix Common Problems" plugin reporting the error. Thanks!
  3. Did you ever fix this issue? I seem to have the same thing. The VM starts fine without the passthrough, but fails as soon as I set it to pass through the USB controller. EDIT: I changed the drive to IDE instead of VirtIO, and it works now. Not sure if that causes a performance hit, to be honest.
  4. Gonna answer myself here. I just watched @SpaceInvaderOne's fantastic video about this. I seem to have forgotten quite a few steps since the last time I did this. After appending the controller ID I've got it working. Thanks! Still struggling with it not finding the boot device, though. I'm guessing it tries to boot from the USB controller somehow.
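For anyone landing here later: "appending the controller ID" refers to binding the controller to vfio-pci on the kernel command line. A sketch with a placeholder ID (the real vendor:device pair comes from `lspci -nn` on your own box; the edit goes in the `append` line of /boot/syslinux/syslinux.cfg on Unraid):

```shell
#!/bin/sh
# ID is a placeholder vendor:device pair; find yours with: lspci -nn | grep -i usb
ID="1b21:1242"
# Printed here rather than written; this is the shape of the boot entry edit:
echo "append vfio-pci.ids=${ID} initrd=/bzroot"
```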
  5. Hi, I just installed a new PCIe USB controller in my ProLiant M350 because I needed more ports; I went from a 2-port controller to a 5-port one. I want to pass the entire controller through to a VM like I did before. On the first boot it let me do this, but the VM wouldn't boot. After trying again, the controller has since disappeared from the available "Other PCI devices" list. Where did it go, and how can I get it back?
  6. Hi, I'm trying to get a Stash Docker container up and running, and it's dependent on ffmpeg for transcodes. I didn't really know where to start, so I've installed ffmpeg at /user/local/bin/ffmpeg, and I can call the command just fine (it doesn't complain about dependencies). I followed this script, only editing the link for ffmpeg; this user seems to be running the app on Unraid, but I'm not smart enough to decipher what he is doing just from this script/image: https://github.com/stashapp/stash/issues/30#issuecomment-477737548

     I'm seeing this error in the Docker log, but I don't understand much of this either:

     time="2019-05-22T12:05:52Z" level=error msg="ffmpeg error when running command </usr/bin/ffmpeg -v quiet -ss 131.74200000000002 -y -i /data/[MEDIAFILE].wmv -vframes 1 -q:v 2 -vf scale=1280:-1 -f image2 /generated/screenshots/77c4cca450631a7f3fec0803e9bfdbda.jpg>"

     My current Docker setup; I've tried various things without luck. mp4 and m4v files play just fine, but wmv results in an error. Thanks!
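Since the app invokes ffmpeg with `-v quiet`, the real failure is hidden. A sketch of how to re-run the same command by hand with errors visible (the input and output paths here are placeholders; running it inside the container via `docker exec` tests the same binary the app calls):

```shell
#!/bin/sh
# Rebuild the logged screenshot command, but with errors shown instead of -v quiet.
# INPUT and OUTPUT are hypothetical placeholder paths.
INPUT="/data/sample.wmv"
OUTPUT="/tmp/screenshot.jpg"
CMD="ffmpeg -v error -ss 131.742 -y -i $INPUT -vframes 1 -q:v 2 -vf scale=1280:-1 -f image2 $OUTPUT"
echo "$CMD"
# If WMV decoding is the problem, it would also show in: ffmpeg -decoders | grep -i wmv
```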
  7. Is there an easy way to switch to the regular Plex container from this Plex Pass container? Could it be as simple as changing the repository?
  8. Access to my server via apps (Android, LibreELEC) is somehow gone, and I can't figure out what to do to fix it. I can access the server fine through the web UI, both locally and over WAN, and the server has a green checkmark under Network. I can't see anything in the container log, and I'm not sure where the logs for the Android app are; maybe I could get a log from LibreELEC? I've tried disabling SSL on the server to rule that out, I've tried downgrading PMS, and I've tried rebooting pretty much everything multiple times. Any ideas?
  9. Anyone using end-to-end encryption and got it working successfully? I've got it set up, and it works for adding a small amount of data at a time, but adding multiple photos or large videos makes the entire thing crash. It seems unstable at best. My Android client can see the files but not delete them, and sometimes they show up as encrypted even though they should be unencrypted. It stores the files in an inaccessible tmp folder, which still holds the files after the upload crashes. Any tips? EDIT: The GitHub repo for the app also contains plenty of unresolved issues, so it might not be ready yet, unfortunately.
  10. I took a backup of what I could find and reinstalled the plugin. The config files didn't get removed during the uninstall, so the backup turned out to be unnecessary. It works now after the reinstall.
  11. Thanks for a solid answer! Unfortunately I didn't get very far. As far as I can tell, rcloneorig is the actual rclone process, which is simply invoked via "rclone", but I might be wrong.

     root@Tower:~# sed -n '21p' /usr/sbin/rclone
     rcloneorig --config $config "$@";
     root@Tower:~# which rcloneorig
     which: no rcloneorig in (.:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/bin:/usr/bin:/bin)
     root@Tower:~# find / -type f -iname rcloneorig
     find: `/proc/30625/task/30625/net': Invalid argument
     find: `/proc/30625/net': Invalid argument
     find: `/proc/30626/task/30626/net': Invalid argument
     find: `/proc/30626/net': Invalid argument
     find: `/proc/30627/task/30627/net': Invalid argument
     find: `/proc/30627/net': Invalid argument
     find: `/proc/30641': No such file or directory
     find: `/proc/30666': No such file or directory
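A compact way to confirm that diagnosis (a sketch in plain POSIX shell; it only assumes that the /usr/sbin/rclone wrapper execs a binary named rcloneorig, which the output above shows): if the wrapper's target is absent from PATH, the plugin's binary install failed, and reinstalling the plugin restores it.

```shell
#!/bin/sh
# Check whether the real binary behind the /usr/sbin/rclone wrapper exists.
if command -v rcloneorig >/dev/null 2>&1; then
  MSG="rcloneorig found at $(command -v rcloneorig)"
else
  MSG="rcloneorig missing: reinstall the rclone plugin"
fi
echo "$MSG"
```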
  12. Hi, I'm not sure this is related to my recent upgrade to 6.6.1 (from 6.5.3), but I hadn't noticed this error before. I can no longer call rclone through User Scripts, even though the plugin is installed. It results in the following error message:

     /usr/sbin/rclone: line 21: rcloneorig: command not found

     I'm not sure whether reinstalling the plugin will wipe my configuration, or whether I can simply back up the config and scripts from the plugin page to keep all the necessary information. Any ideas?
  13. Sounds like a good idea, thanks for a solid explanation! That would also give me remote access and easy access on mobile, which is a perk. How is the speed? Are the files you work with actually stored locally and then synced?
  14. Hi guys, I'm frustrated with my own lackluster workflow for documents and other files that are actually quite important to me. Before doing a major renovation I thought I'd hear what you guys are satisfied with. In general I'm happy with files that are not actively worked on, like media. However, I create documents and other files across various platforms, which makes them hard to access, control, and back up.

     - I use OneNote for both private and work-related notes. At work, on my work computer (Windows), we use OneDrive, and I sometimes use that laptop to also work on private files, which then end up in OneDrive.
     - I've got a Windows workstation that holds project files like SketchUp models and other items. These are stored locally and then backed up (with Windows Backup) to my Unraid server.
     - I've got a MacBook which also holds documents and files. These are backed up (with Time Machine) to my Unraid server.
     - My Unraid server syncs my backup share to my unlimited Gdrive.

     When revisiting my workflow, I want to be able to work on the same documents on both my workstation and MacBook, and have some of my files accessible on my work laptop. I want to reduce duplicates to simplify cleanup and keep the newest versions.

     So I'm thinking I should consolidate this to work directly on an Unraid share with my files. This would solve the lack of connection between my Mac and Windows machines (and VMs), and my files would immediately be on a parity-protected disk. Unraid could then push all my backups to Gdrive (cold storage) and my document folder to OneDrive (synced, so changes are updated in both places). What I would lose is the local copies on my workstation/laptop, but I guess I could somehow push the important files to my workstation with rsync. Any tips?
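The rsync idea at the end can be sketched as a one-way mirror of the share onto the workstation (everything here is a placeholder: the hostname, the share path, and the local directory; it assumes SSH access to the server):

```shell
#!/bin/sh
# Hypothetical one-way mirror: Unraid documents share -> local working copy.
# --delete makes the local copy track deletions on the share too.
SRC="root@tower:/mnt/user/documents/"   # trailing slash: copy contents, not the dir itself
DST="$HOME/documents-mirror/"
OPTS="-az --delete"
echo "rsync $OPTS $SRC $DST"
```

Run from cron or the User Scripts plugin, this would keep a local copy of the important files without reintroducing duplicates as a source of truth.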
  15. @JonMikelV thanks! Any idea what I could do to get past this error? syslog: