Wondering if anyone has a feel for the power efficiency of older server hardware. I’m reading posts from people who say they have an R710 loaded with drives and it IDLES at 160W with 8 hard drives. So…if you take the hard drives out of the equation, it’s probably still around 120W. Is that just how inefficient old computers are? Kinda like incandescent bulbs are less efficient than LED bulbs? And how efficient is the R730 compared to the R710?
My 6-year-old desktop computer idles at 60W with a GPU, and 30W without the GPU. Seems like a huge difference. It works out to something like $70 more per year to run an R710 than my old desktop with a GPU. Is that correct?
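For what it’s worth, here’s the back-of-the-envelope math behind that $70 figure as a quick Python sketch. The electricity rate is an assumption on my part (about $0.08/kWh gets you to roughly $70/yr for a 100W gap; at higher rates the gap widens), and it assumes both machines sit at idle 24/7.

```python
# Rough annual idle-cost comparison: R710 vs. an old desktop.
# Assumptions (not from the thread): ~$0.08/kWh electricity and 24/7 idle.

RATE_PER_KWH = 0.08          # assumed electricity price in $/kWh
HOURS_PER_YEAR = 24 * 365

def annual_cost(idle_watts: float) -> float:
    """Yearly electricity cost of a machine idling at idle_watts around the clock."""
    kwh_per_year = idle_watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * RATE_PER_KWH

r710_cost = annual_cost(160)     # R710 with 8 drives, per the posts quoted above
desktop_cost = annual_cost(60)   # 6-year-old desktop with a GPU

print(f"R710:       ${r710_cost:.0f}/yr")
print(f"Desktop:    ${desktop_cost:.0f}/yr")
print(f"Difference: ${r710_cost - desktop_cost:.0f}/yr")
```

With those numbers the difference comes out to about $70/yr, so the figure checks out if your power is cheap; at $0.12/kWh it’s closer to $105/yr.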
Anyway, I’m trying to understand this whole power consumption thing too. See my other post if you’re interested. My conclusion is that there’s no simple rule. There are a lot of factors involved, nuanced things like the quality of components, firmware, and so on. You can’t just say old == bad power consumption, new == good power consumption. Your 2012 server is a good example of that.
In my opinion, the best thing to do is look up the specific model you intend to get and check what people’s real-world experiences are with that particular model. That’s your best estimate of the expected power consumption.