Everything posted by Jessie

  1. Probably a bit late now but here it is:- http://computeronline.com.au/products_pic.php?C_ID=24&S_ID=241&PROD=45133 It's a SilverStone CFP53. It bolts straight onto the drive bays of the Coolermaster case above using hard drive screws.
  2. Is anyone aware of a Docker container that will function as a Linux-based Exchange server?
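No single container replicates Microsoft Exchange itself, but for plain SMTP/IMAP duties one frequently recommended project is docker-mailserver. A minimal sketch of a compose file follows; the image tag, hostname and volume paths are illustrative assumptions to check against the project's documentation, and note that it does not provide Exchange features such as ActiveSync or shared calendars.

```yaml
# Illustrative only -- verify image name, ports and paths against the
# docker-mailserver documentation before use.
services:
  mailserver:
    image: ghcr.io/docker-mailserver/docker-mailserver:latest
    hostname: mail.example.com   # assumed hostname
    ports:
      - "25:25"     # SMTP
      - "143:143"   # IMAP
      - "587:587"   # submission
      - "993:993"   # IMAPS
    volumes:
      - ./mail-data:/var/mail                  # mail storage
      - ./mail-config:/tmp/docker-mailserver   # configuration
```

Groupware stacks such as Kopano or SOGo get closer to Exchange-style calendaring and contacts, if that is the actual requirement.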
  3. Thanks for the reply itimpi. I was a bit excited about OpenELEC initially, but I expect that the unRAID server won't be in the same room as the TV. In the past I have just used media players such as Noontec, which play the movie straight off the hard drive over the network. I assume the smart TVs will require something like Plex to play media over the network, or maybe they can play the files directly; I'm not sure.

     What I want to achieve is a cost-effective unRAID machine which will primarily store data. If it is sitting in a home office, it could host a Windows 10 VM and be used as a workstation while performing other tasks in the background. It is important that the operation of the machine be as reliable and uncomplicated as possible: the end user will be a person who expects to be able to turn the machine on and use it, so troubleshooting should be negligible. From a NAS point of view, I have a machine with 14 drives running v4.7 which has been running for years; with the exception of a couple of failed drives I've never had to go near it.

     So far the VMs mounted on the SSDs in the new machine are reliable and very fast, so I am quite excited about the possibilities of this platform. The machine I have built for myself is currently running two VMs, Windows 10 and Linux Mint, both accessed through remote desktop. The remote desktop to the Mint VM is a bit unstable at the moment (it crashes).

     I haven't had much success with the graphics card yet. When I assign it to, e.g., a Windows 10 VM, the VM won't start. The system also reports the graphics card as a Radeon HD 7450 rather than an R5 230. Not sure how to approach this yet.

     I haven't done anything with Docker yet, but I anticipate that it will be one of the prime functions of the server once I get my head around it.
  4. I was going to throw this question to Support, but I thought I'd drop it here first. Can anyone help?

     I have used unRAID for a number of years, so I'm comfortable with the NAS side of it. I am trying to come to terms with VMs, Docker and graphics card selection. I have built a new server close to the specs of the Lime Tech appliance, i.e. a Supermicro board with 14 ports, 2 x IcyDock 5-into-3 hot-swap bays and a 4-into-1 IcyDock hot-swap bay for the SSDs, in an Antec 900 3 case. I cannot source the same graphics cards, so I tried an ASUS GeForce GT 610, which I couldn't get to work. It currently has an ASUS Radeon R5 230, which I also can't get to work.

     I am a system builder and intend to build unRAID systems for my clients with the primary purpose of storage. If I understand the concept properly, the machine might serve as a workstation using a virtual Windows 10 machine while serving movies to smart TVs elsewhere on the premises. Can this be done? I am not sure of the purpose of passthrough: is graphics passthrough only applicable to displays directly plugged into the local video card, or does it somehow provide acceleration to network-connected devices?

     I don't intend these machines to be gaming machines, but I would like them to be able to serve data to smart TV sets, so I don't want to spend big dollars on graphics cards if possible. GTX cards in Australia start around the $200 mark; my goal would be a card around $100 or less.
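On the passthrough question: GPU passthrough only benefits a display plugged directly into the passed-through card (i.e. the VM that owns it); it does not accelerate anything for network clients, which are served over SMB/DLNA/Plex by the CPU. Whether a card can be passed through at all depends on it sitting in its own IOMMU group, which can be checked on the host. A small sketch, assuming the standard Linux sysfs layout that unRAID 6 exposes:

```python
# List IOMMU groups and their PCI devices (standard Linux sysfs layout;
# run on the unRAID host, not inside a VM). Prints nothing if IOMMU is
# disabled in the BIOS or on the kernel command line.
from pathlib import Path

root = Path("/sys/kernel/iommu_groups")
for dev in sorted(root.glob("*/devices/*")):
    group = dev.parts[len(root.parts)]  # the group-number directory
    print(f"group {group}: {dev.name}")
```

If the GPU (and its HDMI audio function) shares a group with other devices, passthrough will usually fail or require ACS workarounds.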
  5. http://www.coolermaster.com/case/case-accessories/haf-stacker-storage-kit/

     I just checked the supplier I got the cradle from; they don't have it any more. I'm not sure it was a Coolermaster cradle, but it might have been. The picture shows the existing case disk holder with the mounting holes. The cradle just screwed straight on using standard case screws. It actually holds 3 drives, but I only had ports for 2. Hope this helps somehow.
  6. The i3 is still on 4.7. It wasn't broken, so I didn't fix it. I will upgrade it soon. It is used purely for storage and is still going strong. I am still getting my head around the new v6 server. It will take over some of the duties of the i3 and the i3 will become purely a backup storage device.
  7. I wanted to build a new server to the same spec as the AVS-10-4 Lime Tech box. Unfortunately the Aussie dollar is going to hell in a handbasket at the moment, and the cost of the case was out of reach once the exchange rate and freight were taken into account. So I decided to build the machine in an Antec 900 2 case. It includes:

     2 x 5-into-3 IcyDock hot-swap bays
     1 x 4-into-1 IcyDock hot-swap bay for the SSD drives
     Supermicro X10SL7-F motherboard
     Xeon E3-1231 v3 processor
     32 GB ECC RAM
     650 W Coolermaster GM power supply, which has enough connectors for all add-ons
     ASUS Radeon R5 230 graphics card with 2 GB on board

     Currently there are 3 x 3 TB WD Green drives in it and an Intel 240 GB SSD. I'm still experimenting with setting up VMs and Docker apps on it at the moment.

     Not sure whether I got it right with the ASUS card. The system reports it as a Radeon HD 7450, and it crashes and burns if specified in a Windows 10 VM or a Linux Mint VM. Any hints on this? Or do I have to bite the bullet and put a GTX card in it to get it going?
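Two notes that may help here. The HD 7450 identification is probably harmless: the R5 230 is a rebadge of the same Caicos silicon, so the PCI ID resolves to the older name. And since unRAID 6 VMs are KVM/libvirt, a GPU is handed to a guest as a PCI hostdev entry in the VM's XML; a minimal illustrative fragment, assuming the card sits at host address 01:00.0 (confirm with lspci) and is bound to vfio-pci:

```xml
<!-- Illustrative fragment only: the bus/slot values below are assumed,
     not taken from this machine. A GPU usually needs its HDMI audio
     function (e.g. 01:00.1) passed through as a second hostdev too. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```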
  8. Just a follow-up on this machine. It has been running 24/7 for over 4 years now. There was a bit of doubt at the time about the use of Gigabyte motherboards and the network controller; this has proved not to be an issue, and the machine is still under heavy use.

     The only mod since then has been the addition of a drive cage to hold an extra 2 drives. I had 2 spare ports on the SATA controller, so I decided to utilise them. The cage bolted to the existing internal cage without any modifications needed. It is still running version 4.7.

     You might notice that the small fans that were on the Wingsonic hot-swap bays are now missing. Those little fans never seem to last, so I removed them. The existing cooling in the case is adequate to keep things running without them.
  9. I was asked to crosspost this from another section because the motherboard uses a RTL8111E LAN controller.

     "My New unRAID Box:
     Gigabyte H67MA-USB3-B3 motherboard
     Intel i3 2100 processor (socket 1155)
     4 GB Corsair XMS3 DDR3 RAM
     2 x Wingsonic 3-into-2 bay SATA 2 enclosures
     2 x Adaptec 1430SA SATA 2 PCIe controllers
     8 x Western Digital WD20EARX 2 TB drives (room for 12 drives)
     Coolermaster RC-692 Advanced case
     Coolermaster GX550 power supply
     SanDisk Cruzer 4 GB flash drive"

     The answer to the question is no: I am not having ANY issues with the LAN controller, or the machine as a whole, so far. I've been running it for about a month. As a test I sent 119 GB of data to the cache drive; the average speed was 59 MB/s, peaking at around 65 MB/s, and it took a little over half an hour to complete. No problems encountered. The syslog reports a RTL8168B/8111B, so I'm not sure whether that is relevant or not.
  10. Drive    Model / Serial No.                      Temp
      parity   WDC_WD20EARX-00P_WD-WCAZA6667327        29°C
      disk1    WDC_WD20EARX-00P_WD-WCAZA6667406        27°C
      disk2    WDC_WD20EARX-00P_WD-WCAZA6646413        28°C
      disk3    WDC_WD20EARX-00P_WD-WCAZA6659947        * (off)
      disk4    WDC_WD20EARX-00P_WD-WCAZA6657777        27°C
      disk5    WDC_WD20EARX-00P_WD-WCAZA9374517        24°C
      disk6    WDC_WD20EARX-00P_WD-WCAZA9371667        24°C
      cache    WDC_WD20EARX-00PASB0_WD-WMAZA5812381    26°C

      I haven't done a power check yet; I will post when I do. I test-ran it for a month to make sure there were no motherboard compatibility issues etc., and added the SATA controllers and hot-swap bays yesterday. The temperature readings above were taken in the middle of receiving backups from a small business server and a workstation, as well as 3 other parallel folder transfers from another workstation, so I think those temps will be typical. (Disks 5 and 6 are in the hot-swap bays.)

      I took note of the comment about the RTL8111E LAN controller and ran a test: I transferred 119 GB of data to the cache drive and the average throughput was 59 MB/s, peaking at 65 MB/s. It took a little over 30 minutes. I haven't found any issues with the LAN controller so far. The syslog reports the controller as a RTL8168B/8111B; not sure whether that is relevant or not. I am wondering whether power supplies might be a factor in the RTL8111E issues, i.e. if the power supply is not up to scratch, the operating voltage might be a bit down.
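The quoted throughput and duration are self-consistent; a quick arithmetic check (decimal megabytes assumed, as vendors and most transfer dialogs use):

```python
# Sanity check of the reported cache-drive transfer:
# 119 GB at an average of 59 MB/s.
size_gb = 119
avg_mb_per_s = 59

seconds = size_gb * 1000 / avg_mb_per_s
minutes = seconds / 60
print(f"{minutes:.1f} minutes")  # about 33.6 minutes, "a little over 30 minutes"
```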
  11. Apologies, I may have posted this twice. Hopefully this one has the pictures.

      My New unRAID Box:
      Gigabyte H67MA-USB3-B3 motherboard
      Intel i3 2100 processor (socket 1155)
      4 GB Corsair XMS3 DDR3 RAM
      2 x Wingsonic 3-into-2 bay SATA 2 enclosures
      2 x Adaptec 1430SA SATA 2 PCIe controllers
      8 x Western Digital WD20EARX 2 TB drives (room for 12 drives)
      Coolermaster RC-692 Advanced case
      Coolermaster GX550 power supply
      SanDisk Cruzer 4 GB flash drive

      Comments: I chose this motherboard because it was compact and had 2 SATA 3 connectors, 4 SATA 2 connectors, a PCIe x8 slot and a PCIe x4 slot. I use the SATA 3 connectors for the parity and cache drives. Currently it runs 8 x 2 TB Green drives and can hold up to 12. I've only just installed the hot-swap bays; it ran for about a month on the 6 internal drives without errors. I have 2 spare ports on the Adaptec controllers, so I might increase the capacity to 14 drives if I can find a suitable bracket to hold them. I'll worry about that if I ever need them.

      I removed the top fan in the case to reduce noise (not needed); the front and back fans produce adequate air flow. The case has rubber-mounted brackets for the drives, so it runs very quiet. I like this case because it allows the cabling to enter from the back, which makes the front look quite neat. I mounted a USB header inside the case to get the flash drive out of the way; I had to file a little off the bracket so that it fitted neatly between the backplane screws, and I used those screws to clamp the header down.

      I know I don't need 4 GB of RAM, but that was the smallest amount I could buy, and it wasn't very expensive. Possibly socket 1155 and the i3 are also overkill, but I wanted to build the machine using current technology. Even so, the cost per gigabyte is much lower, and the machine faster, compared to an off-the-shelf NAS device. I haven't done the power figures yet, but I don't think it will use much.
The processor is rated at 65W when flat out and the green drives are low power users.
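Pending an actual measurement, a back-of-the-envelope load estimate can be sketched. Everything below except the 65 W CPU TDP mentioned above is an assumption, not a measurement:

```python
# Rough load-power estimate for the i3 build. All figures are assumed
# typical values except the CPU TDP, which comes from the post.
cpu_tdp_w = 65         # Intel i3 2100 TDP (from the post)
drive_w = 5            # ~5 W per WD Green drive under load (assumed)
n_drives = 8
board_misc_w = 30      # motherboard, RAM, fans, controllers (assumed)
psu_efficiency = 0.85  # typical PSU efficiency at this load (assumed)

dc_load_w = cpu_tdp_w + n_drives * drive_w + board_misc_w
wall_w = dc_load_w / psu_efficiency
print(round(wall_w))  # roughly 159 W at the wall under full load
```

Idle draw would be well below this, since the CPU rarely runs flat out and Green drives spin down.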