Wondering if anyone has a feel for the power efficiency of older server hardware. I’m reading posts from people who say they have an R710 with lots of hard drives and it IDLES at 160W with 8 drives. So… if you take the hard drives out of the equation, it’s probably still around 120W. Is that just how inefficient old computers are? Kinda like incandescent bulbs being less efficient than LED bulbs? How efficient is the R730 compared to the R710?
My 6-year-old desktop computer idles at 60W with a GPU, and 30W without the GPU. Seems like a huge difference. It’s like $70 more per year to run an R710 than my old desktop with a GPU. Is that correct?
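Quick sanity check on my own math, assuming something like $0.13/kWh (the rate is my assumption, your local rate will vary):

```python
# Back-of-envelope: extra yearly cost of an R710 (~120W idle, drives excluded)
# vs. my desktop with its GPU (~60W idle), both running 24/7.
RATE_PER_KWH = 0.13  # USD/kWh -- an assumption; plug in your local rate

def yearly_cost(watts, rate=RATE_PER_KWH):
    """Cost of running a constant load 24/7 for one year."""
    return watts / 1000 * 24 * 365 * rate

extra = yearly_cost(120) - yearly_cost(60)
print(f"Extra cost: ${extra:.0f}/year")  # ~$68/year at $0.13/kWh
```

So yeah, ~$70/year is about right at that rate; at $0.25/kWh it would be roughly double.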
I have an R930.
With 4x E7-8890 v4s, 9 SSDs, 2x SAS drives, 4x M.2 drives on a PCIe card, and 512GB of RAM across 32 sticks.
Pulls about 400W pretty much all the time, even at idle.
Holy cow. What’s driving half of that wattage? Is it the 32 sticks of RAM? Or the 4 CPUs?
Your server draws 75% of my entire house’s power, my own server included.
I forgot, there’s also a GTX 1650 in there.
But honestly, I’m fairly sure the majority of the power draw is the 4 CPUs.
96 cores and 192 threads on an older architecture are a bit of a power suck. If I had it all to do over again, I would for sure have gotten an EPYC chip instead.
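For a rough idea of where the 400W goes, here’s a back-of-envelope breakdown. Every per-component figure is a ballpark guess on my part, not a measurement:

```python
# Very rough idle-power breakdown for the R930 build above.
# All per-component wattages are assumptions, not measured values.
idle_watts = {
    "4x E7-8890 v4 (idle)":   4 * 45,  # assumed ~45W/socket at idle
    "32x DDR4 DIMMs":         32 * 3,  # assumed ~3W/stick
    "9x SSD + 2x SAS + M.2":  40,      # assumed combined drive idle
    "GTX 1650 (idle)":        10,      # assumed
    "Fans, PSUs, board, HBA": 60,      # assumed platform overhead
}

total = sum(idle_watts.values())
for part, w in idle_watts.items():
    print(f"{part:26s} ~{w:3d} W")
print(f"{'Estimated total':26s} ~{total:3d} W")  # lands near the ~400W observed
```

Which is why I say the CPUs dominate: even generous guesses for everything else don’t add up without them.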
Trying to wrangle my R730XD now; it’s fully populated with 12 HDDs. Downgrading to a more efficient CPU didn’t help much at idle, but swapping the 750W PSU for the 495W one did.
The CPU downgrade does help under load though… running 2x E5-2630Ls.
When the disks spin down it idles a lot lower, so I just ordered 2 SSDs for the flex bay to migrate the noisy applications over. It’s mostly a Plex/file server.
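Rough math on what spin-down saves, assuming ~7W per spinning 3.5" drive and ~1W in standby (both figures are assumptions, check your drives’ datasheets):

```python
# Estimated savings from letting 12 HDDs spin down when idle.
# Per-drive wattages and the electricity rate are assumptions.
DRIVES = 12
IDLE_SPINNING_W = 7.0  # assumed per-drive idle, platters spinning
STANDBY_W = 1.0        # assumed per-drive standby, spun down
RATE_PER_KWH = 0.13    # assumed electricity rate, USD/kWh

saved_watts = DRIVES * (IDLE_SPINNING_W - STANDBY_W)
saved_kwh_year = saved_watts / 1000 * 24 * 365
print(f"~{saved_watts:.0f} W saved while spun down, "
      f"~${saved_kwh_year * RATE_PER_KWH:.0f}/year at $0.13/kWh")
```

That’s ~72W and ~$80/year if the array actually stays spun down most of the day, which is the whole point of moving the chatty apps to the SSDs.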
Out of curiosity, what’s your desktop setup? In particular the motherboard, CPU, RAM, and PSU model and capacity? 30W idle without a GPU is exceptionally good for a 6-year-old PC.
R710 (or anything of that era, really) = fan heater / air con ballast that happens to be able to compute as a side-effect.