6/14 I'm in S Cal and the LA Dept of Water and Power charges me approximately
     $1.00 per 10 kWh (a kWh is 1000 watt-hours). I don't watch TV at all,
     but my computer's on all the time. Suppose I do get a TV,
     and suppose I watch it 4 hours a day and the TV is rated at
     300W, how much does that contribute to the cost? -cheap ass
\_ Look for a product called "Kill A Watt"; it's about $29.
   It's really good at telling you what's sucking up all the
   juice. My old VCR sucks 15 watts even when powered off! To
   answer your question, your computer is probably consuming
   more power than your refrigerator and is likely the biggest
   electricity hog. Do you really need it on 24x7?
\_ or if you do, think about getting a laptop instead.
\_ I have a Kill A Watt, and have done extensive power surveys
   around my home -- highly recommended. A typical PC
   eats around 100W continuously. I ran the math and it came
   out to about $20 a month at my Ream-you PG&E rates of about
   $.24/kWh. A TV's power consumption varies widely, depending
   on the picture (and the technology). My 32" CRT TV displaying
   white images burns something like 40% more power than when displaying a
   black screen. The wattage rating on the UL label
   on the back has little to do with its real-life power consumption.
   Incidentally, flatscreen / LCD TVs and monitors save a ton of
   power, almost 50%. -ERic
\_ I might note as well that your power supplier's cost might
   not be linear. PG&E's rate, for example, goes up a lot depending
   on your total monthly consumption, and this varies with where
   you live. They give more consideration to folks in areas that
   "require" heavy use of air conditioning. So your actual cost
   may go up by more than just the extra consumption, if it
   kicks you into a higher rate tier. -ERic
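   Tiered billing like that is easy to model: you fill each tier in
   order and pay that tier's rate for the kWh that land in it. A
   minimal sketch, with made-up tier breakpoints and rates (real
   PG&E/LADWP tiers vary by territory and season):

   ```python
   def tiered_cost(total_kwh, tiers):
       """Cost of total_kwh under tiered rates.

       tiers: list of (kwh_in_tier, rate_per_kwh) pairs, in order;
       use float('inf') for the size of the top tier.
       """
       cost = 0.0
       remaining = total_kwh
       for size, rate in tiers:
           used = min(remaining, size)  # kWh billed at this tier's rate
           cost += used * rate
           remaining -= used
           if remaining <= 0:
               break
       return cost

   # Hypothetical tiers: first 300 kWh cheap, next 200 pricier, rest steep.
   tiers = [(300, 0.12), (200, 0.18), (float('inf'), 0.30)]
   print(tiered_cost(600, tiers))  # 300*0.12 + 200*0.18 + 100*0.30
   ```

   Note the marginal effect: the extra 36 kWh/mo from a TV costs more
   if your baseline usage has already pushed you into a higher tier.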
\_ 300W * 4hr/day = 1.2kWh/day
1.2kWh/day * 30day/mo = 36kWh/mo
   36kWh/mo * $0.10/kWh = $3.60/mo.
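   The same arithmetic as a reusable one-liner, so you can plug in
   any appliance (the $0.10/kWh is the OP's $1.00-per-10-kWh rate):

   ```python
   def monthly_cost(watts, hours_per_day, rate_per_kwh, days=30):
       """Monthly dollar cost of running a load of `watts` for
       `hours_per_day` hours at `rate_per_kwh` dollars per kWh."""
       kwh_per_day = watts / 1000 * hours_per_day
       return kwh_per_day * days * rate_per_kwh

   print(monthly_cost(300, 4, 0.10))   # the TV above, about $3.60/mo
   print(monthly_cost(100, 24, 0.10))  # a 100W PC left on 24x7
   ```

   At the OP's cheap LADWP rate the always-on PC runs about $7/mo;
   at ERic's $0.24/kWh PG&E rate the same PC is about $17/mo.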