Hello, I have 4 blade servers with a total present power draw of 5250 watts. Is the formula I should follow to calculate the total monthly kWh consumption 5.25 x 24 x 30, or 5250 x 24 / 1000 x 30?
Assuming they're running at their rated 5250 watts, which they're probably not:
5250 watts x 24 hours = 126 kWh per day. PER DAY!
126 kWh x 30 days = 3780 kWh a month, which is approximately four times my entire domestic electricity consumption (including my own home lab, operational network, two EVs and heat pumps).
That is preposterous and ruinous financially. You do not want to do that, I promise you.
They almost certainly won't be running at their rated maximum power, but even 20% of that, running 24/7, is still a ruinous amount of electrical load. Unless your power is somehow free, you want to think very seriously before deploying that.
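If it helps, here's that same arithmetic as a quick Python sketch, assuming the rated 5250 W running 24/7; it also shows that the two formulas in your question are the same thing, just with the divide-by-1000 in a different place:

```python
# Quick check of the math above, assuming the 5250 W rated draw running 24/7.
rated_watts = 5250

kwh_per_day = rated_watts * 24 / 1000    # 126.0 kWh per day
kwh_per_month = kwh_per_day * 30         # 3780.0 kWh per 30-day month

# Both formulas from the original question are equivalent:
formula_a = 5.25 * 24 * 30               # kW x hours x days
formula_b = 5250 * 24 / 1000 * 30        # W x hours / 1000 x days

print(kwh_per_day, kwh_per_month, formula_a, formula_b)
# 126.0 3780.0 3780.0 3780.0
```

Either way you land on 3780 kWh for a 30-day month at full rated load.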
We have an old HP C7000 at work. It pulls 423W empty, 560W with a single blade. Safe to say it's a bad idea.
But regarding your question: You already have the thing, so measure its actual power use, as your workloads will most likely vary over time.
Or to calculate: E(kWh) = P(W) × t(h) / 1000, so 5250 W × 730.1 h / 1000 = 3833.025 kWh.
The 730.1 hours per month is an average over a year.
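As a minimal sketch of that formula in Python (the 730.1 h default is just the yearly-average month length used above):

```python
def monthly_kwh(power_watts, hours_per_month=730.1):
    # E(kWh) = P(W) x t(h) / 1000
    return power_watts * hours_per_month / 1000

print(monthly_kwh(5250))  # ~3833 kWh, matching the number above
```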
To your question: this may be a lot of fun to play around with, but it will cost hundreds of dollars per month to leave on 24/7. For comparison: that's more power than it would take to blast your AC all month in the summer.
Use RapidTables: enter your wattage and hours per day, and if you put in your power cost it will even give you a monthly and yearly bill. Just Google “power cost calculator”.
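If you'd rather skip the website, here's a rough sketch of the same calculation; the $0.15/kWh rate is only an assumed example, so plug in your own tariff:

```python
# Rough bill estimate. The rate below is an assumed example, not an actual tariff.
power_watts = 5250
hours_per_day = 24
price_per_kwh = 0.15  # assumed example rate in USD/kWh; substitute your local price

kwh_per_month = power_watts * hours_per_day * 30 / 1000
monthly_cost = kwh_per_month * price_per_kwh
yearly_cost = monthly_cost * 12

print(f"{kwh_per_month:.0f} kWh/month -> ${monthly_cost:.0f}/month, ${yearly_cost:.0f}/year")
# 3780 kWh/month -> $567/month, $6804/year
```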