
# Formula to figure cost of electricity to run a server

To find the cost of running a device 24 hours a day for a whole month, multiply its wattage by 0.0612.
A 230-watt computer running non-stop as a server would cost \$14.08 a month
(230 watts x 0.0612 = \$14.08).
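The shortcut can be sketched in a few lines of Python (the 0.0612 multiplier assumes 8.5 cents per kilowatt-hour and a 30-day month; the function name is my own):

```python
# Monthly cost to run a device 24/7, assuming $0.085 per kWh
# and a 30-day month (24 x 30 = 720 hours).
# watts / 1000 * 720 * 0.085 is the same as watts * 0.0612.
def monthly_cost(watts, rate_per_kwh=0.085, hours=24 * 30):
    return watts / 1000 * hours * rate_per_kwh

print(round(monthly_cost(230), 2))  # the 230-watt server: 14.08
```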

Here's the original formula, found at consumersenergy.com (PDF file):
1. Find the wattage rating on the back or bottom of your appliance. If the wattage rating is not given, multiply the amps by the volts to find the watts (amps x volts = watts).
Example: A microwave rated for 6 amps and 120 volts draws 720 watts. 6 amps x 120 volts = 720 watts

2. Divide the appliance wattage by 1,000 to convert watts to kilowatts.
720 watts divided by 1,000 = 0.72 kilowatts

3. Multiply the kilowatts by 8.5 cents (a typical cost per kilowatt-hour for electricity).
0.72 kilowatts x \$0.085 = \$0.06 (rounded off)
The cost to use a 720-watt microwave oven for one hour is 6 cents.
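The three steps above translate directly into a bit of Python (using the same 8.5-cent rate; the function names are my own):

```python
# Step 1: amps x volts = watts
def watts_from_amps(amps, volts):
    return amps * volts

# Step 2: watts / 1,000 = kilowatts
def to_kilowatts(watts):
    return watts / 1000

# Step 3: kilowatts x rate = cost for one hour of use
def hourly_cost(watts, rate_per_kwh=0.085):
    return to_kilowatts(watts) * rate_per_kwh

w = watts_from_amps(6, 120)      # the microwave: 720 watts
print(round(hourly_cost(w), 2))  # about 6 cents: 0.06
```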

So the magic number is 0.0612. Find the watts for the device, multiply by 0.0612, and you get the cost to run the device non-stop for one month. (The 0.0612 comes from \$0.085 per kilowatt-hour x 720 hours in a 30-day month, divided by 1,000 watts per kilowatt.)

Posted by: spudart on Aug 17, 06 | 9:49 am

14 bucks a month to run a server? just for electricity? my goodness that seems awfully high.

Posted by: e Maldre on Aug 17, 06 | 10:41 am