jdag

Members
  • Posts: 29

Everything posted by jdag

  1. I am hoping someone can help, and I am admittedly not very conversant in using unRAID. I set it up years ago, and it has been working flawlessly since. Obviously a great testimonial to its capability! But now I am getting what is likely a simple error via the Fix Common Problems plugin. Please see the message below, and advise "as if I were 5 years old" (as Michael Scott said in The Office). A cleanup sketch for the flagged folders is included after this list. Thanks in advance, John

     The following user shares will be excluded from the permissions tests: /mnt/user/appdata
     Processing /mnt/user/Backups
     Processing /mnt/user/docker.img
     Processing /mnt/user/Media
     Processing /mnt/user/Misc
     Processing /mnt/user/Photos
     The following files / folders may not be accessible to the users allowed via each Share's SMB settings. This is often caused by wrong permissions being used on new downloads / copies by CouchPotato, Sonarr, and the like:
     /mnt/user/Media/.Trash-99 nobody/users (99/100) 0700
     /mnt/user/Misc/.Trash-99 nobody/users (99/100) 0700
     Directories Scanned: 3640
     Files Scanned: 104369
  2. This is precisely what I am doing. I recently subscribed to the Office 365 family plan (I needed to upgrade from Office 2011 for Mac anyhow). I set up each family member (3 of us) with a MS account, and each person now has 1TB of OneDrive space (as well as the latest Office version). I then installed Duplicati on each of our 3 computers, each backing up to a personal OneDrive location. I then created a generic MS account for my unRAID server. I set up the Duplicati docker, and used that generic login to use its OneDrive destination. More-or-less, I am sending my photos from unRAID up to OneDrive. I still have 1 more Office 365 account I can eventually use. For $100/year, it is quite a good value. In fact, I am saving $50/year over what I was spending with CrashPlan, albeit with a more complex setup. But so far it is working very well.
  3. I am certainly no IT expert, so I want to handle this as simply as possible, and with as few "extras" as possible.
  4. Interesting. I am using a Mac, so maybe it is different. Within Duplicati, I am able to select "Local folder or drive" and then select the volume and share in the "Folder Path" box. I have been testing it, and it does work. Although, maybe the FTP method is a better one?
  5. Thanks a ton, Ambrotos. It certainly sounds like you and I are on the same page. And I agree that using the offsite sources to restore would only be needed in the most extreme situations, in which case I'd likely have a lot of other pressing issues on my mind! I do have a question for you. You said "So I installed Duplicati on each of my family's 4 PC's which back up to a "PC-Backups" share on unRAID via FTP (making use of SlrG's ProFTPd plugin)". Why did you do that as opposed to just backing up using Duplicati's default "Local Drive" option? As for Backblaze B2, I hadn't looked into that option. I was leaning to Amazon Cloud Drive or MS OneDrive.
  6. I need to come up with an alternative to CrashPlan ASAP, as my contract expires later this month. I am not going to continue with their business plan. I've decided to use Duplicati, and I believe I've narrowed my options down to 2 alternatives; I was hoping for input from those here. In terms of background: I have 3 Mac laptops and 1 unRAID server. Virtually all of the laptop data is recoverable in the event of an issue. Each laptop is backed up locally to a Time Capsule, and even if those backups are compromised, nearly all data/files are kept within cloud-based email and/or Dropbox (yes, I understand Dropbox is a sync service and not a backup). I am only saying that there are multiple layers/copies of files already, so I am not overly concerned about the laptops. On the other hand, I am very concerned about getting off-site storage for my unRAID server. In total, the unRAID server holds ~8TB of videos/music and ~1TB of photos. The videos/music are copied off to external hard drives that I keep at my parents' house (and of course they don't change often). The photos are my #1 priority, and I add to and edit those photos very often, so a constant backup solution is important. My 2 alternatives are: 1) Back up each computer to unRAID, then unRAID to the cloud (I would start with the folders that hold the laptop backups as well as the folders holding photos; I may eventually add the video/music folders to the backup, but again, that is not urgent). 2) Back up each laptop plus the server individually to the cloud (similarly, not the videos/music to start). The reason #1 is more attractive to me is that I then have a "central connection" to the cloud storage. I would not have to worry about multiple cloud accounts, checking each laptop to assure connectivity, etc. Thoughts or suggestions would be very appreciated! Thanks in advance, John
  7. Thanks a ton for confirming. I checked the Plex DVR forum and there supposedly is a new version. But, when I look at the version number it is the same as the old one. I wonder if someone made a mistake in reporting the new version ID.
  8. Yes, I see that same error "ERROR: cannot verify downloads.plex.tv's certificate, issued by 'CN=DigiCert SHA2 Secure Server CA,O=DigiCert Inc,C=US':" I suppose this means that there is a wider problem and it is not just my side?
  9. Right...I just intended to say that there was "something" that happened between the time everything worked until now. I did not want to sound as though "nothing changed". I've PM'd the info you requested. Thanks for the support!
  10. Yes. The Version field is set to the Plex Pass Beta version (which I don't want to post here as it is discouraged). That field has not been touched since the last version change I made, from memory maybe about 8 weeks ago. It has been working for me since that last version change. I recorded and watched a DVR video on Wednesday, so this issue has presented itself sometime between ~10pm Wed and ~5pm today when I noticed that the DVR functionality was missing in the Plex client. I then looked at the Plex Server settings and found that I am somehow on 1.3.3.3148. I believe, but am not positive, that there was a container update today or yesterday, but again, my version variable is correct. I assume that's what has caused this. Any idea how to remedy?
  11. I just noticed that the Plex DVR capability has disappeared. I have Plex Pass, and had edited the version to pull the Plex DVR Beta version, and everything was fine up until Wednesday night (I recall Wed as I recorded and watched a show that evening). Now, my server version is 1.3.3.3148, and I cannot seem to run the lower Beta version (I don't want to post the Plex Pass Beta version here). Any ideas? Thanks!
  12. Is it possible to access some of the dockers I am running on my unRAID server from outside of my home network (PlexPy, Deluge)? And if so, how? Is it advisable to do so? Thanks!
  13. Hoping someone can help... I cannot connect at all. I installed using the default settings, and when I try to access via VNC on port 5900 I receive a connection dialog box, then a request for a password, but nothing thereafter. Thanks, John
  14. No, container updates for Plex are triggered because we updated something in the container (which may not be significant, as builds are triggered automatically each time a GitHub PR is merged). You set the version yourself in the template, and it will not be affected by the container update. Thanks
  15. I am using the Linuxserver.io Plex docker, have PlexPass, and have updated the version variable as needed to access the Plex DVR beta. Today I see that Plex Docker is reporting an update. I assume that I should not apply that update since I am already using a beta version, correct?
  16. Thanks for the help. I hate having to make these tweaks/modifications, but it is the nature of the beast I suppose.
  17. Thanks, it worked for me too. Is this a permanent fix or will a future update to Deluge overwrite the ":12"? Or should we remove ":12" at some point? I assume this was necessary following the upgrade to unRAID 6.2, right? Thanks, John
  18. Thanks so much for the explanation. Krusader seems to be my answer. What I had been doing was moving the full folder over to my local machine, uncompressing, then moving the uncompressed file back to the server.
  19. Thanks for the input. I did install Krusader and it worked wonderfully. So I will use this method going forward. I am still confused about the "old" method. I am wondering if the issue with the other method was related to the utility I use on my Mac (The Unarchiver), or could it be the minimal RAM I have on my unRAID box? I have tried using the SSD cache drive exclusively, but no luck, still unusably slow. As for my setup, I am not sure exactly what info you'd need, but here goes: Lenovo TS140, i3-4150 3.5GHz, 4GB RAM, SanDisk 480GB SSD, 3x6TB WD Red drives (1 parity, 2 data). Thanks!
  20. Is there a trick to being able to extract RAR files on a drive array? Whenever I try, it is incredibly slow (or does not finish). So instead of extracting on the array, I am copying the folder over to my local drive, extracting, then copying the extracted file back. I would think there has to be a better method. I have tried several different approaches (locating the folder on the cache drive, extracting to the cache drive from the array) with no better performance. (A server-side extraction sketch is included after this list.)
  21. Thank you. I will wait until my iTunes Organization is complete as I do not want to interfere with it. I will then grab the log file and attach here.
  22. I am brand new to unRAID, but have what I feel is more-than-adequate brand new hardware: Lenovo TS140 i3 3.5GHz, 4GB RAM, 3x6TB WD Red drives (2 data, 1 parity), and 1x480GB SanDisk Ultra II SSD. Yet activities on the NAS are just painfully slow. I am certainly hoping it is my setup, because quite honestly, it is useless as is. Here are a few examples: 1) Adding 126 RAW photos (each ~45MB) from my SD card into Lightroom, with the folder stored on 1 of the data disks, took ~1 hour. Those same 126 files being imported to my Mac's internal SSD took ~1 minute. (A rough throughput calculation for this example appears after this list.) 2) Organizing files in iTunes (following moving my iTunes media files over to the NAS) is still running and only ~1/2 complete after ~14 hours. 3) Extracting RAR files is pretty much impossible. I've tried several 8-10GB 1080p movies and ~1GB 720p TV shows, but have had to cancel since they just were not progressing. Any advice on where to start troubleshooting? Thanks in advance, John
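Regarding the .Trash-99 folders flagged in post 1: those hidden directories are trash folders created when a client deletes files over the share, and the 0700 mode is what makes Fix Common Problems flag them. Below is a minimal, hypothetical Python sketch of the kind of cleanup that would clear the warning. The two paths are copied from the plugin output above; everything else (running it as root on the server, choosing to delete rather than re-permission) is an assumption, and the plugin's own suggestions or unRAID's New Permissions tool remain the supported route.

    # Hypothetical cleanup for the two .Trash-99 folders reported by
    # Fix Common Problems in post 1. Assumes it runs as root on the unRAID
    # box and that the trash folders hold nothing worth keeping.
    import os
    import shutil

    FLAGGED = [
        "/mnt/user/Media/.Trash-99",   # nobody/users (99/100) 0700
        "/mnt/user/Misc/.Trash-99",    # nobody/users (99/100) 0700
    ]

    for path in FLAGGED:
        if not os.path.isdir(path):
            print(f"skipping {path}: not found")
            continue
        # Option A: delete the trash folder outright.
        shutil.rmtree(path)
        print(f"removed {path}")
        # Option B (instead of deleting): loosen the permissions so the
        # share's users can see inside it again, e.g.
        #   for root, dirs, files in os.walk(path):
        #       for name in dirs + files:
        #           os.chmod(os.path.join(root, name), 0o775)
        #       os.chmod(root, 0o775)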
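For the RAR question in post 20, the fix that eventually worked (Krusader, posts 18-19) amounts to doing the extraction on the server itself instead of pulling the archive across SMB and pushing the result back. The sketch below shows that idea in Python; the paths are made-up examples, and it assumes the unrar binary is available on the host or in whatever container runs the script.

    # Sketch of server-side extraction (relates to post 20): unpack onto the
    # SSD cache first, then move the finished files to the array in one pass.
    # The archive/scratch/destination paths below are hypothetical examples.
    import os
    import shutil
    import subprocess

    archive = "/mnt/user/Downloads/example.rar"   # hypothetical source archive
    scratch = "/mnt/cache/extract_tmp/"           # fast SSD scratch space
    dest = "/mnt/user/Media/"                     # final location on the array

    os.makedirs(scratch, exist_ok=True)

    # "x" keeps the archive's folder structure; "-o+" overwrites existing files.
    subprocess.run(["unrar", "x", "-o+", archive, scratch], check=True)

    # Move the extracted files to the array as a single sequential copy.
    for name in os.listdir(scratch):
        shutil.move(os.path.join(scratch, name), os.path.join(dest, name))

Whether this would beat Krusader in practice is an open question; the point is simply that the heavy I/O stays on the server instead of crossing the network twice.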
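Finally, a quick back-of-the-envelope calculation for the Lightroom example in post 22, using only the numbers given there (126 RAW files at ~45 MB each, ~1 hour to the array vs ~1 minute to the local SSD). The gigabit figure in the comment is a general rule of thumb, not something measured on this setup.

    # Effective throughput implied by the import times reported in post 22.
    total_mb = 126 * 45                  # ~5,670 MB of photos in each test

    nas_rate = total_mb / 3600           # ~1.6 MB/s importing to the array
    local_rate = total_mb / 60           # ~94 MB/s importing to the local SSD

    print(f"Import size:      {total_mb} MB")
    print(f"To unRAID array:  {nas_rate:.1f} MB/s")
    print(f"To local SSD:     {local_rate:.1f} MB/s")

    # A single gigabit link can realistically carry ~110 MB/s, so the array
    # import is running at only a percent or two of what the network allows,
    # which suggests the bottleneck is something other than raw wire speed
    # (e.g. parity writes without using the cache, SMB overhead, or the 4 GB RAM).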