
huntjules (Members, 105 posts)

Posts posted by huntjules

  1. All set up now. Unraid already supports rsync. I just had to configure the QNAP remote end with the correct user share credentials, SSH and encrypted port number, then fiddled with the QNAP drop-down menu options; after trying a few, the fourth QNAP option let the QNAP see the Unraid shares.

     

     If others are interested, there's a screen snip below of the options that got it working for me.

     [screenshot attached]
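     For anyone who wants to sanity-check the connection outside the QNAP GUI, a dry run along these lines should confirm that rsync over SSH can reach the Unraid share (the port, user, hostname and paths below are placeholders, not my actual setup):

     # Dry-run an rsync-over-SSH copy from the QNAP side to the Unraid share.
     # Swap in your own SSH port, user, hostname and source/destination paths.
     rsync -avn -e "ssh -p 2222" /share/Backup/ backupuser@unraid-tower:/mnt/user/backups/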

     

  2. Hi, I'm wanting to enable RTRR or rsync on Unraid please. I want Unraid to handle HA production traffic in a secure environment, but I also need a snapshot backup to a less secure location on the same network that can see and take snapshots from the Unraid production NAS / shares. (Yes, the snapshot will be stored on the offsite QNAP backup server for site redundancy / DR, as part of the backup/replication strategy.)

     

     The QNAP QTS NAS has that feature in its Hybrid Backup, but it needs RTRR or rsync to be enabled on Unraid.

     

     I'd prefer not to use Duplicati (though I'm using it as a temporary workaround at the moment), as I'm finding it a bit finicky with database errors on large storage with lots of files (plus a few other niggles). The QNAP just seemed more elegant and robust for my cheaper backup requirement, and I'd prefer to take the backup workload off Unraid, since Plex users can sometimes be affected while Duplicati is running a backup... and the QNAP TS128A with its backup software was only £70 🙂

     

     I've added a screenshot of Unraid showing what I have.

     

     I've also added screenshots from QNAP Hybrid Backup (HB3), which supports RTRR (real-time remote replication) and rsync. Can anyone offer guidance on how to enable RTRR or rsync to my Unraid, please?

     

     [screenshots attached]

     

     

  3. On 7/14/2019 at 8:55 PM, huntjules said:

     I prefer running stable code, plus I need version 1.16.1.1291, as currently my Plex (version 1.16.0.1226) can't populate the EPG and won't record anything... quite frustrating.

     

     May I ask if there's any news, please? Plex have been naughty in taking a crucial feature out of their stable release and making it only available on their Pass release. Will Limetech look to release 1.16.2.1321 anytime soon, please?

  4. On 7/1/2019 at 6:05 PM, Squid said:

     If you want to always run the latest, greatest, (and occasionally bug ridden) version of Plex, then you should instead run the version from linuxserver.io

    Sent from my NSA monitored device
     

    On 7/12/2019 at 3:14 AM, Pim Bliek said:

    It's not too much to just politely ask, right? I'm also curious when we can get the new version on this Docker image, since now my EPG is not working anymore. I did manage to work around it with an EPG-XML grabber and using that but it's far from perfect...

     

    So, as politely as I can: can we have an estimate on when we can expect a new version?

    On 7/1/2019 at 6:20 PM, lordofiron said:

     Well, in this circumstance it's a necessity for me. I use Plex to record OTA programming. It's my understanding that you can ONLY receive new EPG data with the newest version of Plex, 1.16.1 and up. It's not that I have to run the latest and greatest. I've been running Limetech's docker for about a year and only updated twice in that time. It's just the fact that a key piece of functionality I depend on is now broken; that's the reason I'm asking.

     I prefer running stable code, plus I need version 1.16.1.1291, as currently my Plex (version 1.16.0.1226) can't populate the EPG and won't record anything... quite frustrating.

     

  5. On 6/28/2019 at 12:46 AM, LesterCovax said:
     

    I had to roll back to that version due to tracker whitelisting as well (and even got a nastygram from an admin asking for a lot of proof on my setup due to inconsistencies).

     

    You need to re-add the actual `*.torrent` files for the torrents you had active AFAIK.  I first moved/copied everything from the `completed` folder to the `incomplete` folder.  I then copied all of my `*.torrent` files from the `/data/.torrents` directory to my `/data/.torrents_add` directory, which is configured to auto-add any torrents in that directory using the `autoadd` plugin.  It will populate the torrents for every `*.torrent` file you added and should then check the progress against what you moved from your `completed` to `incomplete` directory.  You can select them all and choose "Force Recheck" if it's not doing it for some reason.  Then just wait a long long time depending on how large your torrents are.  For any that I still don't have the `*.torrent` file for, but the files were in `completed`, I just check a list on the tracker itself for torrents I haven't fully seeded and redownload the torrent file, or just manually find it on the tracker if it's not on the list.

     

    Royal PITA, it is
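
     Roughly, the re-add procedure described above boils down to something like this (a sketch only; I'm assuming the completed/incomplete folders also live under /data, so adjust the paths to your own container mappings):

     # Put the downloaded data back where Deluge expects to find it,
     # then drop the .torrent files into the autoadd watch directory.
     mv /data/completed/* /data/incomplete/
     cp /data/.torrents/*.torrent /data/.torrents_add/
     # Finally, in the Deluge UI select all torrents and choose "Force Recheck"
     # if the recheck doesn't start on its own.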

     I'm not sure if it's related, but I had an issue with downloads stopping from my private tracker. I noticed it only happened after the last update of the Blocklist plugin (John Garland, "Download and import IP blocklists"). I've just disabled the Blocklist plugin and restarted Deluge, and downloads started again.

  6. On 3/27/2017 at 11:50 AM, Neogola said:

     I have issues with version 1.5.1.3520 with the web client. All browsers say "convert failed..."; with the previous version everything worked great!

     I need instructions on how to downgrade from version 1.5.1.3520.

     

    Thank you

     I'm not sure if it's related, but I had an issue with downloads stopping from my private tracker. I noticed it only happened after the last update of the Blocklist plugin (John Garland, "Download and import IP blocklists"). I've just disabled the Blocklist plugin and restarted Deluge, and downloads started again.

  7. On 6/27/2019 at 6:58 PM, kelmino said:

    So I had auto-updates turned on and saw that it updated Deluge to  2.0.  Unfortunately, some of my private trackers do not have that whitelisted, so I went ahead and downgraded with the following build.

     

    binhex/arch-delugevpn:1.3.15_18_ge050905b2-1-04

     

     When I did that, it booted up just fine, except it lost all of my torrents (all my settings seem to be good). I had over 200 torrents. Is there a simple way to re-add all of my torrents? I have them all moving to a completed folder when they finish, so I didn't think I could just re-add the torrent files, because I thought it would just re-download them all unless I pointed over to the completed folder, which is not something I'd really want to do at this point.

     

    I tried to restore from a previous CA Backup / Restore Appdata from a few days ago, but after I copied over the files it still booted up with zero torrents in the program.

     

     I've attached a picture of my completed torrent folders to show where they download and move to. (I also have labels set up to auto-move completed torrents to folders based on those labels.)

     

    Any help would be appreciated, thanks!

     

    Screen Shot 2019-06-27 at 1.55.11 PM.png

    Screen Shot 2019-06-27 at 1.57.27 PM.png

     I'm not sure if it's related, but I had an issue with downloads stopping from my private tracker. I noticed it only happened after the last update of the Blocklist plugin (John Garland, "Download and import IP blocklists"). I've just disabled the Blocklist plugin and restarted Deluge, and downloads started again.

  8. I'm not a Linux expert, so I'm hoping for some guidance on what to type into the CrashPlan / Code42 for Small Business docker to increase the RAM to 4 GB. The wording doesn't tell me (as I'm a newbie) what I need to type into the field to increase the RAM to 4 GB... any guidance appreciated, please?

     

     CRASHPLAN_SRV_MAX_MEM: Maximum amount of memory the CrashPlan Engine is allowed to use. One of the following memory units (case insensitive) should be added as a suffix to the size: G, M or K. By default, when this variable is not set, a maximum of 1024 MB (1024M) of memory is allowed.
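
     Going by that description, I think the value to type into the field is simply 4G. For anyone running the container outside the Unraid template, the equivalent on a plain docker command line would look roughly like this (a sketch only, with the usual volume and port mappings omitted, and assuming the jlesage/crashplan-pro image mentioned elsewhere in my posts):

     # Sketch: the important part is the 4G value for CRASHPLAN_SRV_MAX_MEM.
     # Add your normal volume/port mappings from the template as well.
     docker run -d --name=crashplan-pro \
       -e CRASHPLAN_SRV_MAX_MEM=4G \
       jlesage/crashplan-pro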

     [screenshot attached]

  9. On 12/26/2017 at 1:04 AM, Djoss said:

     

     First, appdata from your old container is not fully compatible with this one, so the copy you did is useless. Normally, you would have to start with an empty appdata.

     

     Then, you can skip the file transfer without issue. The wizard assumes that the current device doesn't have a local copy of the data in the cloud, which is obviously not the case.

     

     Since the file paths between the old and the new container are different (your files are under /storage in the new one), you will need to re-select your files under the correct path, then perform a backup (without removing files marked as missing, which are under the old /mnt/user path). Because of deduplication, nothing will be re-uploaded.

     

    All these instructions can be found at https://github.com/jlesage/docker-crashplan-pro#taking-over-existing-backup

     

     

     @Djoss As I have my monthly config backups in a different folder, I deleted all previous CrashPlan dockers and images, re-downloaded a fresh docker, and let the docker do its usual stuff. I'm now re-following the "Taking Over Existing Backup" guidance on GitHub from the link you sent; hopefully this will help. Appreciate your guidance! Cheers, Julian

  10. Hi, I'm really confused here. I previously migrated from CrashPlan Home to the previous CrashPlan Pro app (gfjardim/crashplan:latest), went through the migration to Pro, and all seemed to work OK using the Pro version.

     

     Now it appears I need to use the jlesage/crashplan-pro:latest docker instead. Obviously I want to re-use my existing config, so I added a folder / path (/mnt/user/appdata/CrashPlanPRO) and copied the gfjardim appdata config content to it (see below).

     

     [screenshot attached]

     

     The docker starts OK; I'm able to log in with my usual credentials and I choose my existing tower, but then I'm asked to choose files to transfer or skip. If I choose file transfer, it appears I need to choose files from my CrashPlan online backup, but if I choose to skip files I receive a scary message (see below).

     

     [screenshot attached]
     All I want to achieve is to sync the data on my tower to my online CrashPlan so my backups can restart (obviously I don't want to resend the TBs of data already synced, as that might take about a month). Some guidance would be appreciated please, as I don't want to mess things up.

     

    Migrating to jlesage crashplanPro.pdf

  11. Hello, I've been using OpenVPN via PC and iPhone and it worked really well. I recently upgraded to the latest version, 2.1.12, and since then I cannot connect to the VPN from any device.

     

     

     I'm able to log in and download the user profile for the user, and I'm able to log in as admin and configure the VPN server, but I'm not able to connect using any of the profiles.

     

    local auth failed: password verification failed: auth/authlocal:42,web/http:1609,web/http:750,web/server:127,web/server:134,xml/authrpc:110,xml/authrpc:164,internet/defer:102,xml/authsess:50,sagent/saccess:86,xml/authrpc:244,xml/authsess:50,xml/authsess:103,auth/authdelegate:308,util/delegate:26,auth/authdelegate:237,util/defer:224,util/defer:246,internet/defer:190,internet/defer:181,internet/defer:323,util/defer:246,internet/defer:190,internet/defer:181,internet/defer:323,util/defer:245,internet/defer:102,auth/authdelegate:61,auth/authdelegate:240,util/delegate:26,auth/authlocal:42,util/error:61,util/error:44

     

     BTW, I'm using Local for password storage. I have SSHed in and updated the admin password, and after I noticed issues with users, I SSHed in and tried to re-do the user passwords (just in case the local store had lost the passwords during the upgrade).

     

     Any guidance please?

     

    Cheers, Julian

  12. On 6/12/2017 at 8:09 PM, huntjules said:

     @gridrunner loving your videos, really appreciated. Following them I managed to get Deluge with VPN working well. Just a quick question if I may: I see Deluge's default port selection for incoming is fixed and outgoing is random. Do both of these need to be the same and fixed, with the appropriate port rule added to my router, to allow sharing of my downloaded torrents? Or with the VPN set up, will these files automatically accept incoming connections / be shared via the VPN tunnel?

     I set up port forwarding on my router and I seem to be able to share/upload downloaded content with the VPN enabled, so I'm hoping it was just a newbie error.

  13. On 5/18/2017 at 10:56 PM, tiny-e said:

     Just installed. Any time I try to queue or convert I get an error complaining that it can't read/write to the directories (the ones I chose in the setup).

     

     

     

     Hello, I have the same issue with read and write permissions. I've checked, and read and write are enabled for both the source and output folders (I've enclosed a PDF doc showing pics of the configs, errors and copy-pasted logs). Any guidance appreciated?

     

    Handbreak read&write issues.pdf

  14. On 18/05/2017 at 11:30 PM, Djoss said:

     Did you map the '/output' folder?

     

     Under the "Docker" tab, all the mappings are shown in the "Volume Mappings" column. You can copy-paste them here if you want.

     Hello, I have the same problem with read/write permissions. I've set my output to /mnt/user (see photo). Any guidance please?

     

     

    handbreak config.PNG
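
     For reference, the kind of mapping being discussed would look roughly like this on a plain docker command line (the "converted" share name is just an example, the jlesage/handbrake image name is my assumption, and on Unraid the same mapping is normally set in the container template rather than with docker run):

     # Map a writable host share to the container's /output folder.
     # "converted" is a placeholder share name; use one that exists on your server.
     docker run -d --name=handbrake \
       -v /mnt/user/converted:/output:rw \
       jlesage/handbrake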

  15. @gridrunner loving your videos, really appreciated. Following them I managed to get Deluge with VPN working well. Just a quick question if I may: I see Deluge's default port selection for incoming is fixed and outgoing is random. Do both of these need to be the same and fixed, with the appropriate port rule added to my router, to allow sharing of my downloaded torrents? Or with the VPN set up, will these files automatically accept incoming connections / be shared via the VPN tunnel?

  16. On 5/23/2017 at 5:08 PM, eschultz said:

     

     Latest version of Plex has been posted. Check for updates on the Docker tab and an update should be available for Plex.

     Hi, where do we find the changelog for the latest Limetech Plex docker, please? (Wanting to make sure it's worth the hassle of upgrading.)

  17. 11 hours ago, unevent said:

     


    Change your install and config path to include cache drive, ex: /mnt/cache/config/filebot...

    Make 'config' a cache-only share.

    Sent from my ASUS_Z00AD using Tapatalk
     

     

     Thank you unevent, I now have Filebot running on the cache drive.

     

     Any guidance on how I configure which folders Filebot watches, and which folders Filebot renames and moves files to? I can only see the screen below, and it doesn't seem obvious how to set up Filebot to do its tasks.



     

     

    Filebot config.PNG

  18. Hello, I have two questions please:

     1) I've installed the Filebot plugin (see config attached); it's running, but where do I configure the folders and actions, please?

     2) I see a warning about it only being installed in RAM. I have a large cache drive; how do I put this onto the cache drive rather than into RAM, please?

     

    Appreciate the guidance

    Filebot config.PNG
