
NAS and Home/Gaming in one computer?


Hi there,


I've had an unRaid server (HP Gen8) for over a year, plus an iMac 2012... I'm just wondering if it would be clever to combine my home computer and my NAS in one powerful device. The reason I'm a bit skeptical is the hardware running 24/7. Would it be bad (and power-inefficient) to use a powerful machine as just a NAS most of the time and run Windows or macOS (as a VM) only a few hours per day, maybe not even every day? So what I'm asking is: should I use two computers (server and home), or doesn't it matter in terms of power consumption and hardware "wear"?


It isn't as bad as you might think. Using a 1080 Ti as a baseline, it looks like you would be adding something like 15 W of idle draw for the card. If you go with the right processor (which, if I'm not mistaken, will be an Intel one if power is a concern), you could get a very reasonable idle draw. Given the age of what you are running now, it could even be slightly more efficient if you went with a 1050 or 1060.


I'd recommend that you do some baseline numbers; I have a feeling you can build more than you think with a reasonable power draw if you pick the parts properly.


Some info: https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-ti,4972-6.html
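If it helps to put numbers on the "baseline numbers" advice, here's a rough sketch of what a constant extra idle draw costs per year. The ~15 W figure is from the post above; the electricity price is an assumption (pick your own local rate):

```python
# Rough sketch: yearly cost of extra hardware idling 24/7.
# Assumption (not from the thread): 0.30 EUR/kWh electricity price.

def annual_idle_cost(extra_watts: float, price_per_kwh: float = 0.30) -> float:
    """Yearly cost of a constant extra power draw, in currency units."""
    hours_per_year = 24 * 365
    kwh_per_year = extra_watts * hours_per_year / 1000  # watt-hours -> kWh
    return kwh_per_year * price_per_kwh

# ~15 W idle GPU, running year-round:
print(f"{annual_idle_cost(15):.2f} EUR/year")  # prints "39.42 EUR/year"
```

So at that rate, a 15 W idle GPU costs on the order of tens of euros per year, which is a useful sanity check before deciding whether two machines or one makes sense.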


Thanks, I didn't know that a high-end GPU needs so little power when idle. And it's also not bad (durability-wise) for a GPU and CPU to run 24/7 at idle?


I was actually thinking about a Ryzen CPU, which sounds very interesting right now, but I wasn't aware of the power consumption differences between AMD and Intel.


I guess the biggest problem will be getting a "golden" macOS VM. I'll do some research into which combination of parts could work, but my biggest concern right now was/is the durability of high-end (and therefore expensive) hardware running 24/7. I don't want to replace a 1080 Ti after two years because it sat idle all the time.


Most hardware these days has very impressive idle control to lower power usage when not needed, which has the side effect of leaving most of the device unused. GPUs are a little different from drives: a drive will go almost totally silent, while a GPU will still pull some voltage to stay "ready" to deliver instantly and keep the desktop running. That said, a quality GPU is built as well as a CPU. The fan on the GPU might be at risk if it doesn't power down while the card is idle, but I wouldn't think your odds of GPU failure over time go up dramatically... then again, I've never really looked into it :).


If you want to dig into this, check out the durability reported on GPUs by Bitcoin miners. They are going to be FAR AND AWAY the worst case for a GPU, and they have lots of data to back up their numbers. Just remember, they run at 100% load 24x7, so if they expect N years out of a card, then you can likely expect N*<a lot>.

Edited by Tybio

On 5/15/2018 at 8:57 AM, lixe said:

and hardware "wear"?

On the subject of wear: I run my home system 24/7, as I've historically sided with the view that most wear happens during cold starts. I've done so for years and haven't noticed any downsides from constant usage. Yes, I do notice a hit on the power bill; as for what percentage, I haven't done those calculations in years, so I don't have good numbers for you.
