06-08-2010, 04:42 PM
Beer Money Baron
Industry Role:
Join Date: Jan 2001
Location: brujah / gmail
Posts: 22,157
Quote:
A computer whose label or power supply says 300 watts might only use about 70 watts when it's actually running, and only 100 even in peak times with serious number-crunching and all the drives spinning.
Quote:
For example, let's say you have a big high-end computer with a gaming-level graphics card and an old CRT monitor, and you leave them on 24/7. That's about 330 watts x 24 hours x 365 days/yr = 2,890,800 watt-hours, or 2891 kilowatt-hours. If you're paying $0.14 per kWh, you're paying $405 a year to run your computer.
Let's try a different example: You have a computer that's less of an energy hog, like an iMac G5 20", which uses about 105 watts, and you're smart enough to turn it off when you're not using it. You use it for two hours a day, five days a week. That's ten hours a week, or 520 hours a year. So your 105 watts times 520 hours = 54,600 watt-hours. Divide by 1000 and you have about 55 kilowatt-hours (kWh). If you're paying 10¢ per kilowatt-hour, then you're paying about $5.50 a year to run your computer.
Eh, not so much. He doesn't need a monitor or a high-end graphics card.
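For what it's worth, the math in that second quote is just watts x hours / 1000 x rate. Here's a quick Python sketch using the example numbers from the quote (not real measurements), if anyone wants to plug in their own box:

# Annual electricity cost sketch, using the example figures quoted above.
def annual_cost(watts, hours_per_year, rate_per_kwh):
    # watts * hours / 1000 gives kilowatt-hours; multiply by the rate per kWh
    kwh = watts * hours_per_year / 1000
    return kwh * rate_per_kwh

# High-end box + CRT left on 24/7 at $0.14/kWh
print(round(annual_cost(330, 24 * 365, 0.14), 2))  # ~404.71, i.e. "$405 a year"

# iMac G5 20" used 2 hrs/day, 5 days/week, at $0.10/kWh
print(round(annual_cost(105, 10 * 52, 0.10), 2))   # ~5.46, i.e. "$5.50 a year"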