Wondering if anyone has a feel for the power efficiency of older server hardware. I’m reading posts from people who say their R710 with 8 hard drives IDLES at 160W. So if you take the hard drives out of the equation, it’s probably still around 120W. Is that just how inefficient old computers are, kinda like how incandescent bulbs are less efficient than LED bulbs? And how does the R730 compare to the R710 efficiency-wise?

My 6-year-old desktop computer idles at 60W with a GPU and 30W without it. Seems like a huge difference. It works out to something like $70 more per year to run an R710 than my old desktop with the GPU. Is that correct?
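Here’s my rough math as a quick Python sketch. The 120W figure is just my drives-removed R710 estimate from above, and the $0.13/kWh rate is an assumption on my part; plug in your own numbers:

```python
# Back-of-envelope: annual cost to run a box 24/7 at a given idle wattage.
# RATE is an assumed average electricity price; yours will vary.
RATE_USD_PER_KWH = 0.13
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost(watts: float) -> float:
    """USD per year for a constant load at the given wattage."""
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * RATE_USD_PER_KWH

r710_idle = 120    # my estimated R710 idle, drives excluded
desktop_idle = 60  # my desktop idle with GPU

print(f"R710:    ${annual_cost(r710_idle):.0f}/yr")     # ~$137
print(f"Desktop: ${annual_cost(desktop_idle):.0f}/yr")  # ~$68
print(f"Delta:   ${annual_cost(r710_idle) - annual_cost(desktop_idle):.0f}/yr")  # ~$68
```

At that rate the difference is roughly $68/yr, so the ~$70 figure checks out if the 120W guess holds.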

  • hodak2B · 10 months ago
    My server runs a LOT of things, including numerous crypto nodes, so it’s not a total loss.

    Plus I do love all the room for activities, and I use it constantly for making YouTube videos.