Posts posted by heffe2001
-
Noticed this in my log:
Jul 23 04:50:47 media01 kernel: timekeeping watchdog: Marking clocksource 'tsc' as unstable, because the skew is too large:
Jul 23 04:50:47 media01 kernel: 'hpet' wd_now: 6da0260f wd_last: 6d1ee276 mask: ffffffff
Jul 23 04:50:47 media01 kernel: 'tsc' cs_now: 292d9f99dfeaf cs_last: 292d98b8a523c mask: ffffffffffffffff
Jul 23 04:50:47 media01 kernel: Switched to clocksource hpet
-
Still having issues adding trackers with your docker. Any tracker that requires a login gives me a "data not received" error when it attempts to test and add. I was able to add the Sonarr info (I had the external port changed to 9118 in Docker since I was attempting to run two instances, and that threw off the authentication to Sonarr). The attempted additions aren't showing up in the log files, so I'm not sure which way to turn at this point.
That being said, my previous install from the other thread still works fine.
-
The current source they have on their git compiles to 0.4.3.1, so that's definitely the current version, just not the current version they have pre-compiled and available. I usually grab the source and compile it on one of my systems; I guess he snuck an update in there over the past 2 days. Do you happen to know offhand where your docker stores its config files? I'd like to keep that stuff on the Unraid filesystem.
I just played around a bit with it, and can't connect it to my Sonarr either; it gives me a 401 authentication failed. I also get failures on all of the torrent sites that require a login. Seems like it might be something in the distro setup you're using for your docker, maybe?
Where did you get the release of 0.4.3.1? That's the version it shows running, whereas I'm running the latest they show available (precompiled, anyway), 0.4.3.0, on my install.
I compiled from source on my windows box, and the current source does show 0.4.3.1. Upgraded my install (from my original Jackett), and it still allows me to add Torrent Day. Not sure why yours isn't working correctly.
I also went so far as to replace the jackett install in your docker I have running, and the code I compiled won't allow it to be added on your docker install running my code. Could be a dependency or something possibly?
it's directly a build from git (my fork to add frenchtorrentdb). I'll check tonight sorry!
-
Used one of the Jackett dockers on the docker repository. My docker commandline (change the paths to your own locations):
docker run -d --name="Jacket" --net="bridge" -e TZ="America/New_York" -p 9117:9117/tcp -v "/mnt/docker-appdata/data/jackett":"/config":rw -v "/mnt/docker-appdata/data/jackett":"/root/.config/Jackett/":rw -v "/mnt/docker-appdata/data/jackett/app":"/app":rw ammmze/jackett
I extracted the contents of the latest Jackett release zip file (from the git above) into the app/ directory. The extracted zip actually has a Release directory in it, with all the files under that; move those files up into the app/ directory itself. On my install I mapped the config directory to both /config and /root/.config/Jackett/. Those mappings aren't strictly necessary to run, but that way your config files are safe from a removal and reinstall (otherwise it stores the config files IN the container, not on your Unraid filesystem).
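The zip-shuffling step above can be sketched like this. It's a minimal, self-contained illustration that uses throwaway temp directories and a placeholder filename so it runs anywhere; substitute your real appdata path (for example /mnt/docker-appdata/data/jackett/app) for APPDIR and your real extraction directory for EXTRACT.

```shell
APPDIR=$(mktemp -d)                 # stands in for the mapped /app directory
EXTRACT=$(mktemp -d)                # where you unzipped the release zip
mkdir -p "$EXTRACT/Release"         # the zip nests everything under Release/
touch "$EXTRACT/Release/JackettConsole.exe"   # placeholder for the real files
mv "$EXTRACT/Release/"* "$APPDIR/"  # move the files up into app/ itself
ls "$APPDIR"
```

The point is just that the files must end up directly in app/, not in an app/Release/ subdirectory, or the container won't find them.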
It's not Unraid-specific, but I've managed to get it running OK, and it works with my Sonarr without having to run it on my Windows boxen.
Now if anybody wants to MAKE an Unraid docker for this, I'll gladly switch over, but this at least works for me at the moment.
-
Anybody know if the ARC-1680IX-24-2G cards work in Unraid? Getting one this weekend.
-
Dropped the # and changed it to a _ without any issues at all. Restarted after the change, assigned it, now the system finds the drive every time. One less thing to worry about lol.
-
This is what I use:
Flash: http://www.newegg.com/Product/Product.aspx?Item=N82E16820226463
USB Header: http://www.newegg.com/Product/Product.aspx?Item=N82E16812200474
That header is the same as the 2nd one listed above, just from Newegg. The 16GB Mushkin thumb drives are very small, decently fast, and work fine with Unraid's registration requirements. I have 2 of them, one as a backup in case the one I use for booting ever fails (both are registered for Pro, so no worries there). While 16GB is definitely overkill for the boot drive with Unraid, I don't worry too much about downloading new releases through Dynamix; it'll be a while before I fill that thing up.
-
Going to remove the 2tb drives tomorrow, will rename the raidset at that time, and regen parity. Thanks for the tip!
-
You're actually correct; the drive only shows as ARC-1231-VOL when added, plus a # and a long number string (I guess it treats that like a serial number). Wonder if there's any way to fix that.
Example:
-
I also restart mine rarely, and if I do it's usually for either an Unraid core update or a hardware update. It's not been a huge deal for the array to not start, as I usually remember it. As far as I can tell, the volume identification for that volume is ARC-1231-VOL, and that's it. I'll check it when I restart for the RC6 update after parity finishes this evening (I replaced 4 emptied 2TB drives with a new 8TB last night, which leaves me with 3 2TB and 1 3TB left to move data from and remove). I'm going to have a bunch of 2TB paperweights, lol.
-
Ever since I set up a 4TBx2 (8TB) parity array on my Areca (so I can use my Seagate Archive 8TB drives as data drives), I've not been able to autostart the array. It always comes up missing the parity drive (which is the Areca array, ARC-1231-VOL), even though it's in the dropdown for you to select. All the other drives on the controller have the correct naming (using the instructions at the start of the thread for that). Hopefully this weekend I'll get all the drives moved over to the Areca (I still have 6 drives on the MV8, although 3 of those will be pulled after the data is moved to my latest 8TB).
I will say the Areca card is FAST, getting 132MB/sec on a parity check at the moment.
I've tried increasing the delay after issuing the udevadm trigger, up to 60s total, with the same result. (I don't think the delay is the problem, though, as even at 5s all the drives except parity are in their proper place in the array config.)
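For anyone trying the same thing, the delay being discussed is a sleep inserted in the go script after the udev trigger. A hypothetical fragment, with the sleep value only as an example (the post above reports trying up to 60s without success):

```shell
#!/bin/bash
# Hypothetical /boot/config/go fragment -- the sleep value is an example only.
udevadm trigger
sleep 30   # give udev time to settle device names before the array starts
/usr/local/sbin/emhttp &
```

This only helps if the problem is udev racing emhttp at boot, which the post above suggests is not the case here.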
Any ideas? It'd also be great if Limetech supported temp and spindown/spinup on these cards, not sure how much trouble that'd be though.
-
... Can be resolved by rebooting.
It's amazing how many problems can be resolved by rebooting, not just in Unraid but in computers in general. The first thing I generally ask folks when they need help is whether they've tried rebooting, and a fair number of issues simply "go away".
-
Just noticed some really slow transitions & other weirdness in the Docker tab...
First restart after upgrading to rc5 hung for about 5 minutes after loading the fixes for Areca cards that keep the device names consistent (I had a 20s delay in the go script). emhttp never started, the server was not reachable, and no dockers were running. Since the array wasn't mounted, I did a restart from PuTTY (telnet was working), and everything came up as normal.
When I went into Plex to make sure it was running, I noticed there was an update on the Plex Pass track, so I went to edit my Plex install with the current version number (I'm forcing upgrades using the version variable in the docker config). It took about 25-30 seconds for the edit screen to come up, and once I submitted my changes, the screen that normally shows it downloading the changed bits for the docker came back saying 0 bytes loaded, with Done at the bottom. I clicked the log button and watched the log be populated by the removal and reinstallation of the docker (albeit very slowly). This is all different behavior from RC4 and previous versions.
-
3 dockers added today
nzbmegasearch
dokuwiki
comictagger
Switched over to your nzbmegasearch, seems to work fine.
-
The supplied json file seems to include most of what I'm looking for (at least I've not missed anything because of it yet).
-
I got a sub for Newznab Plus to get a newznab ID, entered it on install of this docker, and it makes a huge difference over the public one that's built in. Anybody having issues may want to give it a go.
Just go to Newznab.com and purchase Newznab Plus; they will give you an ID number. Then change the docker variable regex_url to:
http://www.newznab.com/getregex.php?newznabID=<YOURIDNUM>
I also set my backfill to 90 days; not sure if it helps or not. I'm able to actually search the indexer now and get results, and I'm using it with Couchpotato, Sonarr (NZBDrone), and several other downloaders. You do still get quite a few failures on the decoding, but you're actually able to download what you do scrape.
-
Same on my system, antivirus (but I'm using NOD32). Only way to see the logs on my work system is to disable web protection. I've tried adding both the machine name and IP to the whitelist, but it still won't show.
Do you have some add-on or plugin installed in your browser that is intended to block pop-ups?
Finally found the culprit, it's my antivirus (F-Secure).
No option other than to deactivate it completely to allow the log windows to work.
Thanks all for the help, and sorry, as it seems to be unrelated to RC4.
Hope this can help others anyway.
-
To the first part of your question: it needs a better regex, and there are ones available.
The second part I won't dignify with a response.
Got it up and running on my system, according to the stats it's finding stuff, but as for decoding the names, how would one add one of the better regex addons?
Also, if you set it up initially with 0 in the backfill variable, how would you go about changing it to, say, 30 days? I'm guessing I'd need to wipe the install and re-install with that in the settings?
-
I'm trying it from my work machine; I'll try my home machine and see if that changes anything. It did pull up logs here prior to RC4, though, and I was able to pull up one docker log earlier (I went straight to the Docker page after I started the array and brought up a log for duckdns, which showed; then I tried pysab, but neither that nor any other logs show now). Even running the command the web log viewer uses to populate the screen hangs indefinitely on a command line.
*EDIT* My home machine will open the logs fine. Never had any problem prior to RC4 on my work machine though.
My logs are working... does this happen with all your installed Dockers?
Yes, all Dockers (4), VMs (1), and the Unraid log fail to show.
The logs themselves are working; it's just the webgui that fails to show them.
Have you tried a different machine? I have had (since back in the Dynamix days on 5.X) one machine that refuses to show the system or docker logs in the pop up window (it just shows a blank white screen). It doesn't matter what browser I use. On other machines, it displays fine in all the browsers. Perhaps this is related?
-
Didn't help me; I even tried under IE and Firefox, same result, white window.
-
Hadn't noticed the main Unraid logs showing the same thing (white/blank window with "waiting for server" at the bottom), but I can verify now that it's doing just that on any attempt to view logs from the webui. I will say I was able to open ONE log file when I rebooted my system, but any subsequent attempt shows the white window.
Has anyone else lost the ability to see Docker logs? It opens the popup window, but never receives any data. Tried running the command-line version, and didn't get any output either (I've left the popup window up for an hour+ and no data filled).
I'm experiencing the same issue here since the upgrade from RC3 to RC4.
1. Dashboard view, opening the logs of a VM or Docker opens an empty window that never gets filled
2. Dashboard view, click on unraid log opens an empty window as well
3. Docker and VMs views, opening logs has the same result, a white window with no info
The command line does work:
/usr/bin/tail -n 42 -f /var/log/syslog 2>&1
/usr/bin/docker logs --tail=350 -f PlexMediaServer 2>&1
...
Tested with Firefox 38 and IE 11
-
Has anyone else lost the ability to see Docker logs? It opens the popup window, but never receives any data. Tried running the command-line version, and didn't get any output either (I've left the popup window up for an hour+ and no data filled).
-
Just an FYI, it seems to be working much better with my Headphones installation. Wonder if it was still finishing setup when I was trying yesterday.
Should I re-install, or leave it as-is? I'm guessing it won't matter if I leave it as it is now, since they don't update their stuff very often anyway.
Centralising unraid for xbmc/kodi and seedboxs (posted in General Support)
I heavily modified a script from torrent-invites for using LFTP and a seedboxes.cc account for my torrent downloads. You'll need LFTP installed on your base Unraid install, as well as sshfs-fuse. The original script as supplied would just mirror folders in a specific destination directory on the seedbox with LFTP, but wouldn't download single files (i.e., a download without a parent directory), didn't do SFTP, and also wasn't multi-segmented. Subsequent releases on that site added database functions, but they were Windows-specific, so that had to be modified to work with Linux/Unraid. I also added the capability of determining whether a download is a directory (and using mirror) or a file (and using get), so it handles either without problems. After the files are downloaded, I also call an unrarall script that extracts the rar files, removes the extras (nfo's, samples, and the parent rars), and leaves the files in such a way that Sonarr and Couchpotato can work with them.
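The directory-vs-file branch described above can be sketched like this. Illustrative only, not the original script: the function names and test paths are made up, and in the real setup the path being checked would be on the sshfs mount of the seedbox's finished directory. The lftp options shown (mirror -c, --use-pget-n, pget -n) are its standard resume and segmented-download flags.

```shell
# Pick the lftp command to use for a finished item: directories are
# mirrored, bare single files are fetched with a segmented get.
choose_cmd() {
  if [ -d "$1" ]; then
    echo "mirror -c --use-pget-n=8"   # directory: resumable mirror, 8 segments per file
  else
    echo "pget -n 8"                  # single file: segmented get, 8 segments
  fi
}

tmp=$(mktemp -d)
mkdir "$tmp/Some.Show.S01"            # a download with a parent directory
touch "$tmp/Some.Movie.mkv"           # a bare single-file download
choose_cmd "$tmp/Some.Show.S01"
choose_cmd "$tmp/Some.Movie.mkv"
```

The chosen string would then be fed into an lftp session against the seedbox; the original script handles that part, plus the database bookkeeping.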
The Script:
My notes from the thread on the other site:
Aside from the script above, you'll need cksfv, p7zip, unrar, lftp and sshfs-fuse.
These are the versions (and links) to the packages I used. This is all done on the base unraid install, not in a docker. I placed all these files in the addons directory that unraid installs automatically on boot.
cksfv (http://slackonly.com/pub/packages/14.1-x86_64/misc/cksfv/cksfv-1.3.14-x86_64-1_slack.txz)
p7zip (http://taper.alienbase.nl/mirrors/people/alien/slackbuilds/p7zip/pkg64/13.37/p7zip-9.20.1-x86_64-1alien.tgz)
unrar (http://taper.alienbase.nl/mirrors/people/alien/slackbuilds/unrar/pkg64/14.0/unrar-4.2.4-x86_64-1alien.tgz)
lftp (http://ftp.slackware.com/pub/slackware/slackware64-14.1/slackware64/n/lftp-4.4.9-x86_64-1.txz)
sshfs-fuse (http://taper.alienbase.nl/mirrors/people/alien/slackbuilds/sshfs-fuse/pkg64/14.0/sshfs-fuse-2.5-x86_64-1alien.tgz)
Since for torrents I use the blackhole directories in most of my autodownloaders, I also have a script that sends them up hourly (I'm getting ready to change my scheduling to do the watchdir every 5 minutes and download the resultant torrents every 15-30 minutes), and pulls down the finished torrents every day at 5am (unless I fire it off manually, which is usually what happens, lol).
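A minimal version of that hourly "send them up" step could look like the following. This is not the poster's script, just a sketch: the host, credentials, and both paths are placeholders. It builds an lftp command file that pushes new .torrent files from the local blackhole directory up to the seedbox watch directory using mirror in reverse (-R) mode.

```shell
WATCH_LOCAL=/mnt/user/torrents/watch    # local blackhole directory (placeholder)
WATCH_REMOTE=/watch                     # watch directory on the seedbox (placeholder)
CMDS=$(mktemp)
cat > "$CMDS" <<EOF
mirror -R --only-newer $WATCH_LOCAL $WATCH_REMOTE
quit
EOF
cat "$CMDS"
# To actually run it (needs lftp installed and real credentials):
# lftp -u user,pass sftp://seedbox.example.com -f "$CMDS"
```

--only-newer keeps repeat runs cheap, since only torrent files added since the last sync get uploaded.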
If you have any questions, feel free to ask, but if it's about a seedbox host other than seedboxes.cc, I'm not sure how much help I can be. I've gotten excellent speeds with them: some torrents approach 70-80MB/s downloading to the box (using a gremlin box from them), and rates down to me average around 7.5MB/s, which can completely saturate my 60Mbps cable connection.
Watchdir sync script:
I should also mention that I'm using Deluge on the seedboxes account. I also have my watchdir set up with subfolders (tv, movies, tv-sonarr, music, books, comics, etc.), and have Deluge set up to search each of those directories separately for new torrent files and tag each download with a label for its type. I use the Deluge AutoAdd plugin to create the labels based on the directory the torrent is uploaded into, then use the label's move feature to move the completed files into a specific directory under the finished directory on the seedbox. You could always have all torrents dumped into one big finished directory, but I'd advise against it, as you'll have Sonarr trying to process movies and music, Headphones tagging the directory as unprocessed, and other oddities. If they are segregated into their own directories on the server and downloaded into those folders on your Unraid box, each application can be mapped to its own download folder, and you'll be happier with the results.
*EDIT* I just realized you don't necessarily need to add those files to Unraid's addon install folder if you're using the Nerdtools plugin, with the exception of the cksfv program, as the rest are included in that plugin.
*2nd EDIT* Forgot to put the DB initialization info here: