Formula to figure cost of electricity to run a server

To run something all day long, every day of the month, multiply the watts by 0.0612 to get the monthly cost in dollars. (The 0.0612 multiplier comes from 24 hours x 30 days x the 8.5-cent rate used below, divided by 1,000 to convert watts to kilowatts: 720 x 0.085 / 1,000 = 0.0612.)

Running a 230-watt computer as a server non-stop would cost $14.08 a month (230 watts x 0.0612 = $14.08).
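
If it helps to see the shortcut as code, here's a minimal Python sketch. The 8.5-cent rate and the 30-day month are the assumptions baked into the 0.0612 multiplier; your utility's rate will vary:

```python
# Monthly cost of running an appliance 24/7, assuming a 30-day month
# and the typical 8.5-cent-per-kilowatt-hour rate cited below.
RATE_PER_KWH = 0.085        # dollars per kilowatt-hour (assumed rate)
HOURS_PER_MONTH = 24 * 30   # 720 hours in an assumed 30-day month

def monthly_cost(watts: float) -> float:
    """Dollars per month to run `watts` continuously."""
    kilowatts = watts / 1000
    return kilowatts * HOURS_PER_MONTH * RATE_PER_KWH

# 720 hours x 0.085 / 1,000 is the 0.0612 multiplier from above.
print(f"${monthly_cost(230):.2f}")  # a 230-watt server -> $14.08
```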

Here’s the original formula, found at consumersenergy.com (PDF file); a code sketch of all three steps follows the list:

1. Find the wattage rating on the back or bottom of your appliance. If the wattage rating is not given, multiply the amps times the volts to find the watts (amps x volts = watts).

Example: A microwave rated for 6 amps and 120 volts draws 720 watts of electricity. 6 amps x 120 volts = 720 watts

2. Divide the appliance wattage by 1,000 to convert watts to kilowatts.

720 watts divided by 1,000 = 0.72 kilowatts

3. Multiply the kilowatts by 8.5 cents (typical cost per kilowatt-hour for electricity).

0.72 kilowatts x $0.085 = $0.06 (rounded off)

The cost to use a 720-watt microwave oven for one hour is 6 cents.
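
Here's the same three-step formula as a small Python sketch (the 8.5-cent rate is the assumption from step 3; swap in your own rate as needed):

```python
RATE_PER_KWH = 0.085  # step 3: typical cost per kilowatt-hour (assumed)

def watts_from_rating(amps: float, volts: float) -> float:
    """Step 1 fallback: amps x volts = watts when no wattage is listed."""
    return amps * volts

def hourly_cost(watts: float) -> float:
    """Steps 2 and 3: watts -> kilowatts, then price one hour of use."""
    kilowatts = watts / 1000           # step 2
    return kilowatts * RATE_PER_KWH    # step 3

watts = watts_from_rating(6, 120)      # the 720-watt microwave example
print(f"{hourly_cost(watts) * 100:.0f} cents per hour")  # -> 6 cents per hour
```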


Comments
unlikelymoose, 17 years ago:

14 bucks a month to run a server? just for electricity? my goodness that seems awfully high.
