Password avoidance

Dave

Peter Foldes said:
Ken

35 yrs being an electrician. The draw when you open a light switch with a single 100 W bulb will be approximately +/- 0.03 kvh at start-up, whereas the light staying on will draw 0.01 kvh per every 10 hrs. Now open and close the switch 10 times per day, which will cause a 0.30 kvh registration on the meter, as compared to approximately 0.025 for a bulb continuously burning for a 24 hr period. The same applies to any electrical apparatus, be it a light bulb or the computer, adding the opening surge and then the burning (running) time.
But I will try and find the documentation on this, Ken, and will get it over to you (right now the documentation is in my head and I do not have my CA book at hand here to show a hard copy).
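As a rough sanity check on the figures quoted above, here is a back-of-envelope sketch using nothing beyond energy = power x time for a plain 100 W bulb; it is an illustration, not a meter reading:

    # Back-of-envelope energy figures for a 100 W incandescent bulb,
    # using only kWh = watts * hours / 1000.
    P_watts = 100.0

    kwh_10_hours = P_watts * 10 / 1000   # 1.0 kWh for 10 hours of burn time
    kwh_24_hours = P_watts * 24 / 1000   # 2.4 kWh for a full 24 hr day

    print(f"10 h of continuous burn: {kwh_10_hours:.2f} kWh")
    print(f"24 h of continuous burn: {kwh_24_hours:.2f} kWh")

The switch-on surge is treated separately in the reply below.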
Not to seem irrational or argumentative, but 24 years as an electronic tech, along with two side businesses dealing in electronics repair, plus Google, showed me that the surge when starting a bulb lasts approx. 1/2 cycle, or 1/120th of a second, and only 10-15% of the energy consumed by an incandescent bulb is turned into light; the rest is heat. So you'd have to cycle your bulbs off and on an unbelievable number of times a day, and it would only be equivalent to a few seconds of burn time at most. Fluorescents are a little more economical; the rule of thumb is that if you are going to turn one back on within 15 minutes, it's usually cheaper to leave it on, unless it's in the high-usage part of the day when many utilities charge more per kWh of consumption, in which case the rule of thumb is 5 minutes. Another offset with fluorescents is that the bulbs, transformers and fixtures are quite a bit more expensive, so shortening their life by cycling can account for a bit more monetary loss. The information I found concerning computers was pretty much the same: if it's going to be idle for more than 5 minutes, put it to sleep; more than 10 minutes, turn it off, and you will save energy.

http://www.energysavers.gov/your_home/lighting_daylighting/index.cfm/mytopic=12280
http://www.scientificamerican.com/article.cfm?id=turn-fluorescent-lights-off-when-you-leave-room
http://green.yahoo.com/blog/the_conscious_consumer/61/when-to-switch-off-your-lights.html
Original google was: turn lights off or leave on
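To put a number on how little the surge amounts to, here is a small sketch; the 1/120 s duration comes from the paragraph above, while the tenfold inrush multiple is only an illustrative assumption for a cold filament:

    # Extra energy of one switch-on surge vs. normal burn time, for a 100 W
    # incandescent bulb on 60 Hz mains. The 10x inrush figure is assumed.
    P_steady = 100.0           # W, steady-state power
    surge_multiple = 10.0      # assumed peak inrush relative to steady state
    surge_duration = 1 / 120   # s, roughly half a mains cycle

    extra_joules = (surge_multiple - 1) * P_steady * surge_duration
    equivalent_burn_seconds = extra_joules / P_steady

    print(f"Extra energy per switch-on : {extra_joules:.2f} J")           # ~7.5 J
    print(f"Equivalent steady burn time: {equivalent_burn_seconds:.3f} s")

Even a thousand on/off cycles a day would add up to only a little over a minute of extra burn time under those assumptions.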

To be fair, I did find articles to support your claim, but the number was
probably 10 to 1 against leaving them on. One article stated turning an
incandescent light on used as much power as leaving it on for 5 minutes.
This may sound reasonable, but I turned a cold bulb on and in less than a
minute it was too hot to touch, so I doubt that data. Most articles, for and against, agreed cycling electrical devices cuts down on their life, but the
savings in electricity outweigh the loss in life (which is minimal at best).
Finally, this subject came up when I was in college (for electronics) and
worked with a guy who was pursuing his Master Electrician's License. He
stated pretty much the same as you, so we worked it out in lab one day.
I don't have my references, but the data I remember is the same as the articles I found this time: the surge duration is so short that it takes a lot of cycles to add up to a second of burn time.
Just my 2 cents,
Dave
 
Dave

Seth said:
Actually, now that you mention it, KWH was what I was thinking of. Hands faster than the mind sometimes.

I saw KWH instead of KVH and automatically changed the "watt" to "volt" to
match the acronym. Oops.

But in looking up to see if there is an actual official listing for KVH,
I'm not seeing one (at least not one that has anything to do with
electricity). Did Peter mean to say KWH instead of KVH?
Either one can be converted into the other, but basically they are both a measure of power, or watts (energy consumed). The energy consumed by devices in your home is a combination of the voltage x the amperage, or current. In short, it takes both factors to make the dial spin on your meter.
AFAIK, KWH indicates the number of Kilo (thousands) of Watts (power) you use each Hour (time). KVH is an indicator of the same thing, but it takes a formula to equate it to KWH. I think it takes less in KVH to equal the same amount of power as indicated by KWH.
One of the electricians who participate here probably has a better handle on this if I'm not including something or just plain wrong.
Dave
 
Peter Foldes

Dave

You do know the difference between a household 100 V and a 347 V commercial supply in the amount of their usage?
 
Dave

Peter Foldes said:
Dave

You do know the difference between a household 100 V and a 347 V commercial supply in the amount of their usage?
I can't give you a scientific example of the differences between those two. Off the top of my head I think you might be comparing single phase to a multi-phase supply, like delta or wye. I can say the usage, no matter what the supply, is going to be dependent on the demand.
One final attempt at trying not to look stupid: for the same device, if it's capable of handling the higher voltage, usually the higher the voltage, the lower the current, so the lower the cost to use.
If that isn't what you're asking then you'll have to be more specific, as you're obviously in your area of expertise and out of mine.
Dave
 
Char Jackson

I can't give you a scientific example of the differences between those two. Off the top of my head I think you might be comparing single phase to a multi-phase supply, like delta or wye. I can say the usage, no matter what the supply, is going to be dependent on the demand.
One final attempt at trying not to look stupid: for the same device, if it's capable of handling the higher voltage, usually the higher the voltage, the lower the current, so the lower the cost to use.
No, the cost will be the same regardless of the voltage. Usage is
measured in Watts. Watts are the product of voltage times current, so
if voltage goes up by a certain factor then current comes down by the
same factor. In the end, the Watts are the same, therefore the usage
and the cost are the same.
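A minimal numeric illustration of that point, with a hypothetical 1,800 W resistive load that could be wired for either voltage:

    # Same power drawn at two supply voltages: P = V * I, so doubling the
    # voltage halves the current, and the kWh registered are unchanged.
    P_watts = 1800.0                     # hypothetical load
    for volts in (120.0, 240.0):
        amps = P_watts / volts
        kwh_per_hour = P_watts / 1000
        print(f"{volts:5.0f} V -> {amps:5.1f} A, {kwh_per_hour:.1f} kWh each hour")

120 V draws 15 A, 240 V draws 7.5 A, and the meter registers 1.8 kWh per hour either way.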
 
Char Jackson

Either one can be converted into the other,
Sorry, I have to disagree. Voltage cannot be "converted" into usage.
but basically they are both a
measure of power, or watts (energy consumed). The energy consumed by devices in your home is a combination of the voltage x the amperage, or current. In
short, it takes both factors to make the dial spin on your meter.
AFAIK, KWH indicates the number of Kilo (thousands) of Watts (power) you use
each Hour (time). KVH is an indicator of the same thing, but it takes a
formula to equate it to KWH. I think it takes less in KVH to equal the same
amount of power as indicated by KWH.
One of the electricians who participate here probably has a better handle on this if I'm not including something or just plain wrong.
It's "just plain wrong". :)
 
Char Jackson

Dave

You do know the difference between a household 100 V and a 347 V commercial supply in the amount of their usage?
You do know the difference between voltage and current, right? Voltage
is not a measure of usage. Voltage is the quasi-constant that is
multiplied with the current to get Watts. Watts is the measure of
usage.

This discussion would have been a lot shorter if you had simply
admitted that you meant to type KWH instead of KVH. ;-)

Also, I have to admit I knew the answer when I initially asked about
"kvh" and what it referred to.
 
TOM

Dave said:
Not to seem irrational or argumentative, but 24 years as an electronic tech, along with two side businesses dealing in electronics repair, plus Google, showed me that the surge when starting a bulb lasts approx. 1/2 cycle, or 1/120th of a second, and only 10-15% of the energy consumed by an incandescent bulb is turned into light; the rest is heat. So you'd have to cycle your bulbs off and on an unbelievable number of times a day, and it would only be equivalent to a few seconds of burn time at most. Fluorescents are a little more economical; the rule of thumb is that if you are going to turn one back on within 15 minutes, it's usually cheaper to leave it on, unless it's in the high-usage part of the day when many utilities charge more per kWh of consumption, in which case the rule of thumb is 5 minutes. Another offset with fluorescents is that the bulbs, transformers and fixtures are quite a bit more expensive, so shortening their life by cycling can account for a bit more monetary loss. The information I found concerning computers was pretty much the same: if it's going to be idle for more than 5 minutes, put it to sleep; more than 10 minutes, turn it off, and you will save energy.

http://www.energysavers.gov/your_home/lighting_daylighting/index.cfm/mytopic=12280

http://www.scientificamerican.com/article.cfm?id=turn-fluorescent-lights-off-when-you-leave-room

http://green.yahoo.com/blog/the_conscious_consumer/61/when-to-switch-off-your-lights.html

Original google was: turn lights off or leave on

To be fair, I did find articles to support your claim, but the number
was probably 10 to 1 against leaving them on. One article stated turning
an incandescent light on used as much power as leaving it on for 5
minutes. This may sound reasonable, but I turned a cold bulb on and in
less than a minute it was too hot to touch, so I doubt that data. Most articles, for and against, agreed cycling electrical devices cuts down on their life, but the savings in electricity outweigh the loss in life
(which is minimal at best).
Finally, this subject came up when I was in college (for electronics)
and worked with a guy who was pursuing his Master Electrician's License.
He stated pretty much the same as you, so we worked it out in lab one
day. I don't have my references, but the data I remember is the same as the articles I found this time: the surge duration is so short that it takes a lot of cycles to add up to a second of burn time.
Just my 2 cents,
Dave
I got to thinking about the lamps used on theater marquees. I think they last as long as they do because the switching cycle is short enough that the filament doesn't have to go through the expansion/contraction of a household-type on/off cycle.

Probably wrong, but it sounds sort of logical to me... :>))
 
TOM

Dave said:
I can't give you a scientific example of the differences between those two. Off the top of my head I think you might be comparing single phase to a multi-phase supply, like delta or wye. I can say the usage, no matter what the supply, is going to be dependent on the demand.
One final attempt at trying not to look stupid: for the same device, if it's capable of handling the higher voltage, usually the higher the voltage, the lower the current, so the lower the cost to use.
If that isn't what you're asking then you'll have to be more specific, as you're obviously in your area of expertise and out of mine.
Dave
A fellow I worked with always ordered 125 or 130-Volt bulbs (don't
remember which). He said they lasted a lot longer than standard-voltage
bulbs...
 
TOM

Char said:
No, the cost will be the same regardless of the voltage. Usage is
measured in Watts. Watts are the product of voltage times current, so
if voltage goes up by a certain factor then current comes down by the
same factor. In the end, the Watts are the same, therefore the usage
and the cost are the same.
In my line of work, telephone and e-mail technical support for access control systems and home security systems, I'm often asked questions like, "I have a 12 Volt 2 Amp transformer. Will it damage my alarm system that draws 100 milliamps?"

I usually explain current draw by using the river analogy: a river may be 100 feet wide and 15 feet deep, but if you want a drink of water, you don't drink the whole river, you just fill a glass. The amount of water flowing (measured in cubic feet per second (CFS)) would be the current (amps), while the water pressure would be the voltage. The appliance will draw the current it needs, but the voltage has to be compatible. Like the difference between drinking from a garden hose or a fire hose... :>))
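Putting numbers on that example (the 12 V, 2 A and 100 mA figures are the ones from the question above):

    # A supply's current rating is a ceiling, not a push: the load draws only
    # what it needs at the given voltage.
    supply_volts = 12.0
    supply_amp_rating = 2.0   # maximum the transformer can deliver
    load_amps = 0.100         # what the alarm panel actually draws

    assert load_amps <= supply_amp_rating, "supply would be undersized"
    load_watts = supply_volts * load_amps
    print(f"Alarm draws {load_amps * 1000:.0f} mA, i.e. {load_watts:.1f} W "
          f"of the {supply_volts * supply_amp_rating:.0f} W available")

So the 2 A transformer is fine; the panel simply uses 1.2 W of the 24 W it could supply.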
 
TOM

Char said:
You do know the difference between voltage and current, right? Voltage
is not a measure of usage. Voltage is the quasi-constant that is
multiplied with the current to get Watts. Watts is the measure of
usage.

This discussion would have been a lot shorter if you had simply
admitted that you meant to type KWH instead of KVH. ;-)

Also, I have to admit I knew the answer when I initially asked about
"kvh" and what it referred to.
The measure for speed, at least when I'm involved, is "Furlongs per Fortnight".

I guess that one is off-off-topic... :>))
 
Dave

Char Jackson said:
No, the cost will be the same regardless of the voltage. Usage is
measured in Watts. Watts are the product of voltage times current, so
if voltage goes up by a certain factor then current comes down by the
same factor. In the end, the Watts are the same, therefore the usage
and the cost are the same.
I agree with you as far as theory goes, P = I*E. But in application, take a dryer for instance: hook it to 120 V and it might use 15 amps; hook it to 240 V and it will probably be less than half the amperage. So the total power used will be less for the 240 V than for the 120 V.
 
Dave

Char Jackson said:
You do know the difference between voltage and current, right? Voltage
is not a measure of usage. Voltage is the quasi-constant that is
multiplied with the current to get Watts. Watts is the measure of
usage.

This discussion would have been a lot shorter if you had simply
admitted that you meant to type KWH instead of KVH. ;-)

Also, I have to admit I knew the answer when I initially asked about
"kvh" and what it referred to.
Actually, I don't remember writing KVH. Even if I did, and you know what you
claim, then you know you're calling 12 inches a foot. Both are a measure of
power and, as I stated before, KVH can be converted to KWH using a formula.
 
Char Jackson

I agree with you as far as theory goes, P = I*E. But in application, take a dryer for instance: hook it to 120 V and it might use 15 amps; hook it to 240 V and it will probably be less than half the amperage. So the total power used will be less for the 240 V than for the 120 V.
Come on, you know that's not true. :)
 
Char Jackson

Actually, I don't remember writing KVH.
Peter Foldes did.
Even if I did, and you know what you
claim, then you know you're calling 12 inches a foot. Both are a measure of
power and, as I stated before, KVH can be converted to KWH using a formula.
There's no such thing as KVH, and you can't convert KVH to KWH, for two reasons: 1) there's no such thing as KVH, and 2) there's no formula.
 
Sunny Bard

Actually, I don't remember writing KVH. Even if I did, and you know what
you claim, then you know you're calling 12 inches a foot. Both are a
measure of power and, as I stated before, KVH can be converted to KWH
using a formula.
Power is not the same as energy; also, you can't convert kVh to kWh without actually measuring the current!
 
Dave

Char Jackson said:
Come on, you know that's not true. :)
Actually, it is, and IIRC it has to do with wires' ability to carry power more efficiently at higher voltages. As I understand it, that's also one of the reasons utilities transmit power at kilovolts instead of 120 volts.
But since I can't quote any statistics and am too lazy to google any more, I'm going to say you win. I think the difference is minor anyway, so let's revert to theory and say you're right.
Dave
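For what it's worth, the piece both sides seem to be circling is resistive loss in the wiring: the appliance's watts are the same either way, but the I^2 * R loss in the supply cable shrinks when the current drops. A rough sketch with assumed numbers (the 0.1 ohm round-trip cable resistance and 1,800 W load are hypothetical):

    # Same load fed at 120 V vs. 240 V through cable with an assumed
    # 0.1 ohm round-trip resistance. Only the I^2 * R wiring loss changes.
    P_load = 1800.0    # W, power the appliance itself uses (hypothetical)
    R_cable = 0.1      # ohms, hypothetical round-trip wiring resistance

    for volts in (120.0, 240.0):
        amps = P_load / volts
        cable_loss = amps ** 2 * R_cable
        print(f"{volts:5.0f} V: {amps:5.1f} A, cable loss {cable_loss:5.1f} W "
              f"({cable_loss / P_load * 100:.2f}% of load power)")

That works out to roughly 22.5 W lost at 120 V versus about 5.6 W at 240 V, a real but small difference, which is consistent with both "the watts billed are the same" and "higher voltage moves power more efficiently".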
 
Dave

Char Jackson said:
Peter Foldes did.


There's no such thing as KVH, and you can't convert KVH to KWH for two
reasons: 1)there's no such thing as KVH, and 2)there's no formula.
My bad, you are correct. The acronym I should have used is KVA (kilovolt-amps). IIRC, KW is true power and KVA is apparent power. I don't know that much about the difference, but I know I used to rent generators rated in KW and now they are in KVA, and there is a slight difference in the output ability. And on 2), there is a formula for this one. :-D
Dave
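The formula in question is the power factor relationship: real power (kW) equals apparent power (kVA) times the power factor, which is also why volt-amp figures alone can't be turned into watt-hours. A small sketch with assumed numbers (the 0.8 power factor is only a common illustrative value, not a measured one):

    # Apparent power (kVA) vs. real power (kW): kW = kVA * power_factor.
    kva_rating = 10.0     # hypothetical generator rating
    power_factor = 0.8    # assumed, depends entirely on the load

    kw_real = kva_rating * power_factor
    print(f"{kva_rating:.0f} kVA at a power factor of {power_factor:.1f} "
          f"delivers {kw_real:.0f} kW, i.e. about {kw_real:.0f} kWh per hour at full load")

Without knowing the power factor, which depends on how the current lines up with the voltage for a given load, there is no fixed conversion, which matches Sunny Bard's point above.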
 
