This sounds excessive; that’s almost $1.10/day, which works out to more than 2 kWh per 24 hours, i.e. an average draw of ~80 W. You will need to invest in a TDP-friendly build. I’m running an AMD APU (known for shitty idle consumption) with RAID 5 and still hover under 40 W.
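As a sanity check on those figures, here’s the back-of-envelope math; the $0.55/kWh rate is inferred from the numbers quoted above, not a known tariff:

```python
# Convert a daily electricity cost into an average power draw.
# The rate is an assumption chosen to match the figures above.
daily_cost = 1.10                   # $/day, from the comment above
rate = 0.55                         # $/kWh (assumed)
kwh_per_day = daily_cost / rate     # ~2.0 kWh/day
avg_watts = kwh_per_day * 1000 / 24 # ~83 W average draw
print(f"{kwh_per_day:.1f} kWh/day, {avg_watts:.0f} W average")
```

At a cheaper rate the same $1.10/day would imply an even higher average draw, so ~80 W is the optimistic end.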
This isn’t speculation on my part; I measured the consumption with a Kill A Watt. It’s an 11-year-old PC with 4 hard drives and multiple fans because it’s in a hot environment, and hard drive usage is significant because it’s running security camera software in a virtual machine. Host OS is Linux Mint. It averages right around 110 W. I’m fully aware that’s very high relative to something purpose-built.
You will need to invest in a TDP-friendly build
Right, and spend even more money.
I think the main culprit is the CPU/motherboard, so that’s the only thing that needs replacing. There are many cheap alternatives (under $200) that can halve the consumption and would easily pay for themselves within a year of use. There is a Google doc floating around listing all the efficient CPUs and their TDPs. Just a suggestion, but I’m pretty sure it would pay off its price after a year; there’s absolutely no need for a 110 W average unless you’re running LLMs on it, and even then it shouldn’t be that high.
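A rough payback sketch for that suggestion; the $0.30/kWh electricity rate is an assumption (your actual tariff changes the result directly), and "halving" the 110 W figure is the claim above, not a measurement:

```python
# Payback period for a ~$200 CPU/motherboard swap that halves idle draw.
# Rate and the 50% reduction are assumptions, not measured values.
old_watts, new_watts = 110, 55
rate = 0.30                         # $/kWh (assumed)
upgrade_cost = 200.0                # $, from the comment above
saved_kwh_per_year = (old_watts - new_watts) * 24 * 365 / 1000
saved_per_year = saved_kwh_per_year * rate
payback_years = upgrade_cost / saved_per_year
print(f"Saves ~${saved_per_year:.0f}/yr, pays back in {payback_years:.1f} years")
```

At $0.30/kWh that’s roughly $145/year saved and a payback of about 1.4 years; at higher rates (like the ~$0.55/kWh implied earlier in the thread) it pays back well inside a year.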