Hello, I have 4 blade servers with a total present power of 5250 watts. To calculate the total monthly kWh consumption, should I follow 5.25 x 24 x 30 OR 5250 x 24 / 1000 x 30?
Assuming they’re running at their rated 5250 watts, which they’re probably not:
5250 watts x 24 hours = 126 kWh per day. PER DAY!
126kWh x 30 days = 3780 kWh a month, which is approximately four times my entire domestic electricity consumption (including my own home lab, operational network, two EVs and heat pumps).
That is preposterous and ruinous financially. You do not want to do that, I promise you.
They almost certainly won’t be running at their rated maximum power, but even 20% of that run 24/7 is still a ruinous amount of electrical load. Unless your power is somehow free, you want to think very seriously before deploying that.
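If you want to put numbers on that, here's a minimal Python sketch. The $0.15/kWh rate is just a placeholder assumption, so plug in whatever your utility actually charges:

```python
# Rough monthly cost of 5250 W at different average load levels.
RATE_PER_KWH = 0.15   # $/kWh -- placeholder, use your real rate
WATTS = 5250          # rated draw of the four blades
HOURS = 24 * 30       # one 30-day month, running 24/7

for load in (1.0, 0.5, 0.2):
    kwh = WATTS * load * HOURS / 1000
    print(f"{load:4.0%} load: {kwh:6.0f} kWh = ${kwh * RATE_PER_KWH:,.2f}/month")
```

Even the 20% case comes out to about 756 kWh, over a hundred dollars a month at that rate.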
We have an old HP C7000 at work. It pulls 423W empty, 560W with a single blade. Safe to say it’s a bad idea.
But regarding your question: you already have the thing, so measure its actual use, as your workloads will most likely vary over time. Or to calculate: E(kWh) = P(W) × t(h) / 1000, so 5250 W × 730.1 h / 1000 = 3833.025 kWh. (730.1 is the number of hours per month averaged over a year.)
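If you'd rather script it than punch it into a calculator, that formula is a one-liner; the numbers here are just the ones from above:

```python
def monthly_kwh(watts: float, hours: float = 730.1) -> float:
    """E(kWh) = P(W) x t(h) / 1000; 730.1 h is the yearly-average month."""
    return watts * hours / 1000

print(monthly_kwh(5250))       # 3833.025 kWh, the average-month figure
print(monthly_kwh(5250, 720))  # 3780.0 kWh, a flat 30-day month
```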
Looks like both of your formulas are the same? 5.25 kW x 24 x 30 and 5250 W x 24 / 1000 x 30 both work out to 3780 kWh.
To your question: this may be a lot of fun to play around with, but it will cost hundreds of dollars per month to leave on 24/7. For comparison: that's more power than it would take to blast your AC all month in the summer.
4 blades? Jesus those must be old. I thought 800w idle with 3 UCS b200 m3 blades in a UCS mini was bad…
I’m now running 3 Dell R630s with dual 10-core E5 v4s, 384 GB DDR4, and 8x 1.92 TB SSDs, and it’s under 400 W idle with 12 or so VMs running.
Use RapidTables: enter your wattage and hours per day, and if you put in your power cost it will even give you a monthly and yearly bill. Just google “power cost calculator”.
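This is roughly what those calculators do under the hood, if you want to see it; all three inputs below are assumptions, so substitute your real draw and your real rate:

```python
watts = 5250          # measured or rated draw (assumption)
hours_per_day = 24    # running 24/7 (assumption)
rate = 0.15           # $/kWh from your bill (assumption)

kwh_month = watts * hours_per_day * 30 / 1000
print(f"Monthly: {kwh_month:,.0f} kWh, ${kwh_month * rate:,.2f}")
print(f"Yearly:  {kwh_month * 12:,.0f} kWh, ${kwh_month * 12 * rate:,.2f}")
```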