ibixat

Everything posted by ibixat

  1. I have two servers that each have two NICs, and I was wondering if there's a way to connect the second NIC of each machine directly to the other, giving them a direct link without impacting the bandwidth available on the other two network cards. That would let them transfer files between each other without the router being involved. Is this a terrible idea, or something that could actually be useful?
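That kind of back-to-back link works fine: give each second NIC a static address on a tiny private subnet the router never routes. A minimal sketch, assuming the spare NIC is eth1 on both boxes (interface names and addresses are my assumptions, not from the post):

```shell
# On server A (run as root; eth1 is the spare NIC -- adjust to yours):
ip addr add 10.10.10.1/30 dev eth1
ip link set eth1 up

# On server B:
ip addr add 10.10.10.2/30 dev eth1
ip link set eth1 up

# Transfers addressed to 10.10.10.x now ride the direct cable, leaving the
# primary NICs (and the router) untouched, e.g.:
# rsync -a /mnt/user/share/ 10.10.10.2:/mnt/user/share/
```

With gigabit NICs a plain patch cable between the two ports is enough (gigabit requires auto-MDIX), so no switch or crossover cable is needed.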
  2. Spoke with the woman/office selling them. Turns out they were US Navy surplus; she has 10 left and is holding one for me to pick up Friday.
  3. Tampa Florida, original listing said they had 19, didn't ask how many remain
  4. I'm looking for a rack for my setup and found this available locally. Pretty sure I can fit it in my SUV if I leave the back open and tie it down, or borrow a friend's truck and move it that way... Either way, they want $175 for this, and I'm kind of shocked at the price. Open box (never used) ***For Sale*** Equipto Data Center Server Racks/Enclosures * Color: Blue * OEM air flow requirement approved * Adjustable mounting racks * Two latching doors per unit * Does not include caster wheels * Number available: (19) * Includes (3 per unit): Eaton Powerware ePDU 2.88kVA Power Distribution Units (120V, 24A, 50/60Hz) -- Now, given that I'm looking for a rack, I'd be clinically insane to pass this up, yeah?
  5. 30 TB. Going to be throwing a little more storage in there next week when the new machine arrives, upgrading from my current hodgepodge mess in a Norco 4224 to a Supermicro 24-bay. The case upgrade is mostly just a quality thing; going from an Athlon II X4 965 to a pair of Xeon 2630 v2s, and from 16 GB to 32 GB of RAM, is going to be the best part. That, and having 34 SATA ports available for drives; the current machine only has 16 at the moment, more bays than I can hook up. *edit* The new machine is going to see the 2 TB cache join the array, as well as an additional 2 TB, 1 TB, and 500 GB drive I have lying around (currently preclearing on a spare machine), and I'm considering tossing in a few 2.5-inch drives I have as well. I will be using 8 256 GB SSDs to replace the cache drive.
  6. I believe that's called versatility 😃 It can do a lot with a little bit of hardware; it can do a whole hell of a lot with some power backing it up.
  7. So, coming up very soon, I'm "retiring" my original unRAID server that has been running off and on for 8 years now. It evolved a lot over the years, but its replacement is going to be far more capable and robust, even if it is 5-year-old processors. The question I have to pose for everyone is this: right now there is nothing on the array that is vital, just TV and movies for my Plex, and I am not running with parity, as I really don't care too much if I lose them. I plan to migrate the drives to a new machine, and I think it's time to wipe out the flash drive that's in there and let it start over. There are files and debris from having various versions on there over time, and I'd like to just start with a fresh install: no unMENU files lurking about, no random leftovers from packages I messed around with 5 years ago, etc. Since I'm not running parity and the drives are all good, active drives at the moment, am I able to just start a new array in the new machine, add those data disks to it, fire it up, and go, recreating my shares and whatnot based on their contents? The new machine will have a 24-port backplane and 10 internal SATA, so I'll have a lot more options than my currently 16-drive-capped machine (it's in a Norco 4224, but I need more SAS ports to expand further), so I'll most likely be adding back a parity drive, maybe running my cache as a pool of SSDs, etc. I just really am curious if I can pop the drives in with a new flash and let her rip. I have 2 licenses, so I can set up a second flash drive and leave the original intact just in case.
  8. I've left watch sensors running in a terminal and the web interface open, so that they essentially freeze whenever the system shuts down, and they're only reporting a CPU temperature of around 48°C, unless it's spiking to 90 and then rapidly dropping after shutdown starts and before tasks stop running. Any load I throw at the server any other time doesn't seem to bring the CPU above 50 to 55°C. I wonder if it's somehow Plex doing library stuff that kills it. I've never observed firsthand a temp over 60, leaving watch sensors up the entire day when I'm awake or at home. This always seems to happen overnight.
  9. My system had been shutting down on its own, and at first I thought it was the CPU overheating, but it actually appears to be due to some other thermal sensor. I have finally caught the error, and it's telling me: kernel: thermal thermal_zone0: critical temp reached 91, shutting down. Prior to that it was 90 at the last crash. Either way, the CPU was only reporting around 48°C at the time of the crash. My motherboard is an old AM3 Biostar TA870U3+. Any thoughts on what is overheating so badly that it's causing this shutdown?
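If anyone else hits this, you can see exactly which sysfs zone the kernel is complaining about before it ever reaches critical. A sketch that just reads the standard /sys/class/thermal files (values are in millidegrees C):

```shell
to_c() {                      # millidegrees C -> whole degrees C
  echo $(( $1 / 1000 ))
}

# Print every thermal zone with its type string, current temperature,
# and (if exposed) the highest trip-point temperature.
for z in /sys/class/thermal/thermal_zone*; do
  [ -e "$z/temp" ] || continue
  printf '%s: %s %s°C' "$z" "$(cat "$z/type")" "$(to_c "$(cat "$z/temp")")"
  crit=$(cat "$z"/trip_point_*_temp 2>/dev/null | sort -n | tail -1)
  [ -n "$crit" ] && printf ' (highest trip %s°C)' "$(to_c "$crit")"
  printf '\n'
done
```

Comparing the zone's type string against what `sensors` reports usually identifies the sensor; on older boards thermal_zone0 is often an ACPI/motherboard sensor rather than the CPU, which would fit the CPU itself only reading 48°C.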
  10. I know this post is pretty old, but I was curious what people's opinions were on using just 4 80mm Noctua fans instead of a new backplane and 3 120mm. How much difference do you think would be noticed in a room that usually has a small tower fan and a ceiling fan running?
  11. Well, that's what I did: stop Docker, change the size, restart. That's when things hit the fan. I think it was out of space for some reason, and that's what corrupted the containers. I've been removing them and re-adding them from the templates to get them working, and since everything was stored elsewhere on the array, it seems to be working OK so far.
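For anyone cleaning up after the same failure: once the image has space again, the leftovers can be cleared from the command line. A sketch using standard Docker CLI commands (run on the unRAID console; `prune` deletes things, so review the lists first):

```shell
# See what's actually left behind: stopped/dead containers...
docker ps -a --filter status=exited
# ...and dangling images (what shows up as orphaned in the GUI)
docker images -f dangling=true

# Once you're sure nothing listed is wanted, clean them out:
docker container prune -f
docker image prune -f
```

Re-adding each app from its saved template then recreates the container fresh, which is effectively what worked here.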
  12. I know this is an old post, but it's the method I tried to use to enlarge my Docker image. When I did this, I had several dockers that vanished, and when I leave basic view I see a bunch of abandoned containers. Any idea what would cause this to happen?
  13. That's a great question. I had assumed it was built in. I already gave it my ComicVine API key, as it's basically useless without it. I enabled metadata tagging, but I'm not sure how to know if it's working or not. When I go to the page listing what I have, there is a button to tag it; pressing it says successfully tagged. That said, I'm going to install ComicTagger on my Mac and play with it a bit. Not sure if it makes sense to have it installed in this docker or if I should create a new one, but I'm going to do some research on it. Sounds like if it was included, you would know. Well, ComicTagger has 2 ways it can run, GUI and command line. Since Mylar uses ComicVine to identify the books, when you use ComicTagger to give them their metadata it isn't doing the normal searches you'd do in the GUI; it's taking the issue ID, grabbing that data, and tagging it. It helps keep ComicRack and others really nicely organized. To get Mylar to tag them, though, it needs the command-line ComicTagger installed. I've considered just loading into the docker, installing it if it will let me, and testing it out; worst case, I just wipe and reload the docker. Someone does already have a docker for the GUI version that runs in a browser window and does a fantastic job; the command-line version just helps with that automation we all love.
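For reference, the command-line ComicTagger that Mylar drives can also be run by hand to sanity-check that tagging works at all. A sketch from memory of its common flags (check `comictagger --help` for your version; the file path is just an example):

```shell
# Parse the filename, look the book up online (ComicVine), and save
# ComicRack-style tags into the archive:
comictagger -s -t cr -o -f /comics/ExampleComic_001.cbz

# Print the tag block back out to confirm it was written:
comictagger -p -t cr /comics/ExampleComic_001.cbz
```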
  14. Glad to help. I think it's actually any browser that uses the WebKit engine that's affected: Safari and Chrome won't work, Edge doesn't work; Firefox and IE do. For me at least, and the people I've talked to.
  15. I had that problem with the latest CP and chrome. The check boxes won't work but they work in IE.
  16. OK, I was confused by what you wanted, then. For all my NZB/torrent and related apps, I have a mounted scratch disk (not the cache drive) that I share to all the apps the same way, so that it has the same paths in all of them. I found that the integration between apps can sometimes break down if two apps think the file path is different, so I just keep my watched folder and my download folders in the same paths for each of them. The final destinations aren't always the same structure, but at least on the acquiring and processing side they all match up.
  17. For Transmission, the blackhole directory would be its watched directory if you mean for it to pick up torrents from other apps; or, if it's dumping completed files into another program, it would be that program's watched directory. For example, I have Mylar set to drop NZBs and torrents into /mnt/disks/Apps/Watched, and I have both sab and Transmission set to watch that folder for things to process.
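The layout that makes this painless is one tree, identical in every app's config. A sketch of the structure described above (the `Downloads` subfolders are my assumption; only `/mnt/disks/Apps/Watched` is named in the post):

```shell
# One root for all the acquisition apps; defaults to the current dir here
# so the sketch is runnable anywhere, but on the server it would be
# /mnt/disks/Apps on the scratch disk.
APPS="${APPS:-$PWD/Apps}"

mkdir -p "$APPS/Watched" \
         "$APPS/Downloads/incomplete" \
         "$APPS/Downloads/complete"

# Every app (sab, Transmission, Mylar, ...) then gets pointed at the exact
# same strings, e.g. watched dir = $APPS/Watched, so no app ever hands
# another app a path it doesn't recognize.
echo "$APPS"
```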
  18. If you're just using watched directories, you may run into issues doing it that way, especially if it's moving files around. If you have Sonarr set to control Deluge directly, I think it can remove the file from the list once it's completed and imported. It does for SABnzbd; it's an advanced setting on the Download Clients page, "Completed Download Handling", and the Remove option. I have it set to yes, but I only use NZBs.
  19. Pinion, I didn't see it in the notes or anywhere, but do you have ComicTagger built into the docker, or any tips on how to get it set up in the docker? I've been running it on my Windows machine, but currently my network is not wired due to the house we rent; I'm using a few AC bridges to keep things connected, but that has an impact on performance compared to running Ethernet. Basically I can do it the way I am now, but it's slow and, well, a manual process. ComicTagger really is slick, and the integration with Mylar is one of its strongest aspects, I think.
  20. Oh yeah, my temporary (not a real) solution is to just blackhole the NZBs to sab's watched directory. I'm still setting everything up, so post-processing isn't done at this time; it's not a super big deal for me yet.
  21. Interestingly, you sent me down a path with your post, so I tried the return_host setting in the ini, but now sab just reports empty NZBs all the time, which is strange... Pinion has contacted me and said he will be testing ASAP. This happened to me the one time I was able to get it to actually grab an NZB: it was empty. I tried a lot of combinations of settings and couldn't nail it down. If you guys need any logs from my machine or anything, you can email me; I have an email that is <my forum name> @ <my forum name>.com
  22. I think I'm running into this issue now as well. The Mylar forums seemed to say to fix the return host in config.ini; if I don't do that, everything works but sab can't retrieve the NZB, and if I set the return host, I start getting errors when I try to have it send it over. I can provide some logs of the errors, and the change I made in config.ini before each, if you'd like.
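For anyone comparing notes, the setting being toggled lives in Mylar's config.ini (the key name comes from these posts; the address and path are placeholders, not confirmed values). Stop Mylar before editing and restart it after:

```shell
# Inside the Mylar docker the config is typically /config/config.ini;
# find the current value first:
grep -n 'return_host' /config/config.ini

# Then set it to an address SABnzbd can actually reach, e.g.:
#   return_host = 192.168.1.50
# Leaving it unset was the state where "everything works but sab can't
# retrieve the nzb"; setting it wrong produced the send errors above.
```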
  23. Thanks for the ideas, guys. I'm going to look into them and see what would be easiest to implement that the parents can figure out as well. As far as the internet connection, I have a 35 Mbit up-and-down connection, and my parents do as well, both on FiOS, so transfers should go fairly quickly.
  24. I'm looking for ideas on how to set up a way for my parents to download files off my unRAID server remotely. They live in Florida and I'm in Pennsylvania, and I really don't feel like mailing them packages all the time to send them pictures and videos. Does anyone have thoughts on the best way to set it up so they can get FTP or some other secure access to the shares? I have a 35 Mbit upload cap and would like to set it up so they can just log in and get the new stuff without too much hassle. Any suggestions or ideas would be great; I know there are posts about FTP, security, and other considerations, but I couldn't find anything that sounded like what I'm trying to do.
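One low-friction option is a chrooted SFTP-only login: the parents use any SFTP client (FileZilla, etc.), they can only see the one share, and everything rides over SSH. A sketch under those assumptions (the user name and share path are mine; the sshd_config directives are standard OpenSSH):

```shell
# Additions to /etc/ssh/sshd_config -- jail the account to one directory
# and allow SFTP only, no shell or tunneling:
#   Match User parents
#       ChrootDirectory /mnt/user/ForParents
#       ForceCommand internal-sftp
#       AllowTcpForwarding no

# Create the account (no shell, no home dir needed) and the share:
useradd -M -s /usr/sbin/nologin parents
passwd parents
mkdir -p /mnt/user/ForParents/new
chown root:root /mnt/user/ForParents      # chroot root must be root-owned
chown parents /mnt/user/ForParents/new    # writable drop/pickup subfolder
# then restart sshd so the Match block takes effect
```

Dropping new photos and videos into the `new` subfolder gives them one predictable place to log in and grab everything.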
  25. I messed with it while watching random TV shows off my server last night, and I gotta say: CREEPY... I thought SoundHound/Shazam et al. were voodoo black magic already; this just goes one step beyond. It identified a TV show from like 10 seconds of background noise and then like 2 lines of dialog.