Sunday 7 September 2014

AC Power Supply: 50 Hz or 60 Hz?

1. If we use 230 V, 50 Hz for a load of, say, 100 kW, the current drawn will be small compared with a 110 V, 60 Hz supply. But the cable for 110 V, 60 Hz must be larger, because the higher current requires a bigger conductor cross-section. Most interestingly, it is mainly cold countries that use a 110 V, 60 Hz supply, because the heat dissipated by these larger currents is acceptable in their climatic conditions but not in a country like India. So the choice of supply also depends on the country's climatic conditions.
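A minimal Python sketch of that current comparison, using I = P / V; the 100 kW load and the two supply voltages are the ones named above:

# Current drawn by a 100 kW load at 230 V versus 110 V.
P = 100_000.0  # load power in watts

for V in (230.0, 110.0):
    print(f"{V:5.0f} V supply -> {P / V:7.1f} A")

# 230 V supply ->   434.8 A
# 110 V supply ->   909.1 A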

2. Date: Thu Apr 26 16:08:13 2001
Posted By: Dave Lawrence, Staff, Microelectronics, Lawrence Consulting
Area of science: Science History



The short (and not so interesting) answer to your questions is this: the voltage and frequency chosen for commercial and residential power distribution in a geographic region (country, continent, etc.) are somewhat arbitrary. But once they're chosen, standardization is very important. It's kind of like having everyone drive on the right or left hand side of the road. The choice doesn't matter very much, but agreeing on the choice matters a lot!

That said, would any frequency and any voltage be as good as any other? And the answer is no. Let's consider the reasons for this by looking at what's good and bad about high and low voltage. From there we can zero in on a voltage range that's low enough to avoid the bad things about being high and high enough to avoid the bad things about being low. Then we'll do the same thing for frequency.

As most everyone knows, really high voltage is dangerous--deadly dangerous! So for safety reasons in a home, you want voltage to be as low as possible. Well then why not make it really low, say a volt or two, like a battery? The reason involves the relationships between electric voltage, current, resistance and power, so let's review them.

When current flows through a wire, the wire heats up as electrical energy is turned into heat. This is how electric stoves, toasters, hairdryers etc. work. If enough current flows, the wire can even glow, giving off energy in the form of light. That's how light bulbs work. The rate that electrical energy is being turned into heat and/or light is called power and is given by:

P = I*V

where P (power) is in units of watts, I (current through the wire) is in amps, and V (voltage difference between one end of the wire and the other) is in volts. So a watt is equal to a volt*amp. A 100 watt light bulb could be designed to be lit with 50 volts and 2 amps, or 110 volts and 0.9 amps, or 220 volts and 0.45 amps, and so on. But they would be different bulbs; a 220 volt bulb would be very dim if connected to a 110 volt outlet.

The equation relating the current flowing through a wire (or any object with resistance) to the voltage applied across its ends is called Ohm's law:

V = I*R

where V and I are the same as above and R is electrical resistance in units of ohms. Ohm's law says that for a wire of given resistance, the more voltage applied across its ends, the more current will flow.

Now if you substitute Ohm's law into the equation for power you get:

P = (I^2)*R     (that is, "P equals I squared times R" in plain English).

Having eliminated voltage, we can now see that as current increases in a wire, the power dissipated (that is, the rate that electrical energy is converted to heat and light) goes up very fast, namely as the square of the current.
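As a quick numeric check of the bulb figures above, here is a minimal Python sketch using P = I*V and Ohm's law; the voltages and the 100 W rating are the ones from the text:

# Check the 100 W bulb designs using I = P/V and R = V/I.
P = 100.0  # watts, the bulb's rated power

for V in (50.0, 110.0, 220.0):
    I = P / V   # current the bulb must draw at this design voltage
    R = V / I   # filament resistance when lit
    print(f"{V:5.0f} V: I = {I:.2f} A, R = {R:.0f} ohms, check I^2*R = {I**2 * R:.0f} W")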

That's all the physics we need for now; let's see what it's telling us. Suppose in your house there's a 10,000 watt electric stove. The equation for power, P = I*V, says you'll get just as much heat with high current and low voltage as with low current and proportionally higher voltage. So the stove could be designed for, say, 10 volts and 1000 amps, or 1000 volts and 10 amps; it should get just as hot, just as fast, either way. But it doesn't; the 1000-volt, 10-amp stove gets a lot hotter a lot faster than the 10-volt, 1000-amp stove. The reason is that the stove isn't the only thing heating up; the cord from the stove that plugs into the wall is getting hot too, and that uses some of the power. And even a big fat copper (in other words, low resistance) cord will get hot when a lot of current flows through it. That's what the other equation for power, P = (I^2)*R, is telling us. Even with a small resistance, high current means lots of heat, because the current is getting squared! So to keep from wasting all your power in the cord you want current to be low. But that means voltage has to be high, which is dangerous.
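Here is a minimal Python sketch of that comparison; the 0.01 ohm cord resistance is an assumed, illustrative value, not one from the original answer:

# Same 10,000 W stove fed two ways, through a cord of assumed resistance.
P_stove = 10_000.0   # watts delivered to the stove
R_cord = 0.01        # ohms, assumed cord resistance (illustrative)

for V, I in ((10.0, 1000.0), (1000.0, 10.0)):
    P_cord = I**2 * R_cord   # power wasted heating the cord, P = I^2 * R
    print(f"{V:6.0f} V x {I:6.0f} A: cord wastes {P_cord:8.0f} W")

# At 10 V / 1000 A the cord alone dissipates 10,000 W -- as much as the
# stove itself -- while at 1000 V / 10 A it wastes only 1 W.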
And that's how residential voltage standards were arrived at; 1000 volts is too dangerous, and 10 volts is too inefficient for high-power appliances. The balance was struck around 220 volts, low enough to be safe and high enough to be efficient with high-power appliances (like stoves). Nearly all countries (including the US) use 220 volts as the basic service into the house. In the US we also use 110 for low power applications, such as light bulbs and electronic equipment, where current is low enough for power loss in the cord to be negligible. It turns out that these products can be made a little cheaper if designed to use lower voltage. For example, the filament in a 220V/100W light bulb would have to be thinner and longer (therefore more expensive to fabricate) than the filament in a 110V/100W bulb of equal life expectancy. This was a bigger issue in the early days of light bulb manufacturing than it is now, but Americans are used to 110V outlets, and changing everything to 220V would (for no good reason) scare the heck out of us!

What about frequency? You'll be glad to know that you've just learned most of what you need in order to see where the frequency standards came from. Let's follow our same approach and ask: What's wrong with really low frequency? What's wrong with really high frequency? Then we'll try to see what a good compromise would be.

The lowest possible frequency is 0 Hz, or direct current (DC). What's wrong with that? Why not 110 or 220 volts DC? The answer begins with the same reason we found for using higher voltage with the stove. You just need to think on a bigger scale. Imagine a big city. It uses lots of electric power, so it's kind of like a high-power (VERY high-power) appliance! It gets that electric power from a huge power plant (usually several huge power plants) located tens or even hundreds of miles away. The power is delivered through wires from the plants to the city. So think of the city as a big stove, the power lines as a cord, and the power plants as the wall socket.

Our real stove needed 10,000 watts; a big city might use 10,000 million watts (10,000 megawatts). That's a million times as much power as the stove. If that much power were delivered at 220 volts, the current would be more than 45 million amps (I = P/V), and we know that when electric power is delivered along a wire, high current means lots of power being wasted in the wire (because P = I^2*R). Now 45 million squared is over 2000 trillion! And remember, we're trying to get 10,000 million watts to the city. So even if we could keep R down to a millionth of an ohm (a micro-ohm), efficiency would be only a little over 80%:

efficiency = (power to the city)/(power to the city + power wasted) = 10,000 megawatts/(12,000 megawatts), or about 83%
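The arithmetic behind those numbers, as a small Python sketch using the figures from the text:

# 10,000 MW delivered to a city at 220 V through a one micro-ohm line.
P_city = 10_000e6   # watts the city needs
V = 220.0           # volts
R_line = 1e-6       # ohms, the micro-ohm assumed above

I = P_city / V                    # about 45.5 million amps
P_wasted = I**2 * R_line          # about 2,000 MW lost heating the line
eff = P_city / (P_city + P_wasted)
print(f"I = {I/1e6:.1f} million A, wasted = {P_wasted/1e6:.0f} MW, efficiency = {eff:.0%}")
# -> I = 45.5 million A, wasted = 2066 MW, efficiency = 83%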

What would a 100 mile long, 1 micro-ohm, copper wire look like? The equation for resistance of a copper wire is:
R (ohms) = (1.5E-6) * Length / (pi * Radius^2)

where the length and radius are in centimeters. I'll let you figure out the radius. But it's way too big to be practical!
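If you'd rather not figure out the radius by hand, here is a Python sketch of the calculation, solving the resistance formula above for the radius. (Handbook copper resistivity is about 1.7E-6 ohm*cm, close to the 1.5E-6 constant used above; the sketch uses the text's value.)

import math

rho = 1.5e-6          # ohm*cm, the copper resistivity constant used above
L = 100 * 160_934.0   # 100 miles expressed in centimeters
R = 1e-6              # ohms, the one micro-ohm target

# Solve R = rho * L / (pi * r^2) for the radius r.
r = math.sqrt(rho * L / (math.pi * R))
print(f"radius = {r:.0f} cm, i.e. about {r/100:.0f} meters")  # roughly 28 m!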

Fortunately there's a better way, and we know what it is: deliver the power (P = I*V) at high voltage and low current instead of low voltage and high current. If we increase the voltage by a factor of a hundred, the current can be reduced by a factor of 100 for the same 10,000 megawatts of generated power. But that reduces the power lost in the lines by a factor of 10,000 (because power varies with current SQUARED), so most of the 10,000 megawatts can actually get to the city!

That seems too easy! And you've probably figured out the catch: multiplying 220 volts by 100 means 22,000 volts. We don't want that on the utility pole in front of our house--to say nothing of letting it inside!

So here's how it works. For most of the distance from the plant to your house, power is delivered at tens of thousands of volts. That would be dangerous if people could get close to it, so the power lines are suspended on those huge towers that hold them way up in the air. When the lines get to your house, the voltage is reduced (stepped down) to 220 and the current is increased (stepped up) from whatever it was to whatever you need. Actually it's a little more complicated than that, because your house isn't the only place the power is going. So the voltage gets stepped down a couple of times: first at a substation, to around a few thousand volts, then at the utility pole in front of your house, to 220.

There's only one cost-effective way to "step down" a voltage without wasting power, and that's with a transformer. These are placed on utility poles close to your house. The input to the transformer is high voltage/low current (from the power station), and the output is low voltage/high current (to your house). There's just one last catch: transformers have no moving parts, and they work by electromagnetic induction. Without moving parts, there's no such thing as electromagnetic induction with constant (direct) current! The supply has to be AC, which rules out 0 Hz!
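As a rough Python sketch of that step-down chain, treating each transformer as ideal so that V*I is conserved at every stage; the 10 kW figure and the intermediate voltages are illustrative assumptions, not values from the original answer:

# Voltage and current at each stage of an idealized step-down chain.
P = 10_000.0  # watts flowing toward one house (illustrative)

for stage, V in (("transmission line", 22_000.0),
                 ("substation output", 4_000.0),
                 ("utility pole/house", 220.0)):
    print(f"{stage:18s}: {V:8.0f} V, {P / V:7.2f} A  (V*I = {P:.0f} W)")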
OK, how about 1 Hz? Just kidding! Transformers do get more efficient with increasing frequency, and by around 20 Hz they can be made very efficient. (By the way, "efficient" here means that the power out of the transformer, to your house, is very nearly equal to the power in, from the power station.) The reason for going higher still has to do with light bulbs. At 20 Hz, oscillations in brightness are noticeable (and annoying!) even with incandescent light bulbs. Fluorescent bulbs actually go completely on and off with AC, and at 20 Hz this flickering would be extremely annoying! Flickering goes away around 50 Hz, the standard used in many countries. The 60 Hz standard adopted in the US comes from the use of the periodic voltage as a timing mechanism for electric clocks (60 minutes in an hour, 60 seconds in a minute, so 60 cycles in a second).

There doesn't seem to be any advantage to frequencies above 60 Hz, and there are several disadvantages. They would require that generators turn faster, or have more parts. In other words, they'd be more expensive. Also, wires with alternating current flowing through them emit electromagnetic radiation, and it turns out that the power radiated away (that is, wasted) through this process increases with increasing frequency (this is exactly the same physics that makes transformers only work with AC).

Well, I bet you never thought the answer to your question could be so long! And since you posted it in the "Science History" area, I should finish with a reference to the history of these standards. It's a very interesting story of American industry involving perhaps two of the greatest inventors of all time, Thomas Edison and Nikola Tesla.


3. The supply of electric power to our houses from generating stations is mainly in the form of alternating current (AC). However, the losses experienced along the path from the central power grid station to the sub-stations and then on to the distributors are phenomenal. This loss depends on the frequency of the AC supply. Along the path there are transformers, transmission cables and cores, and the loss of energy in these parts depends directly on the frequency, irrespective of whether the voltage is being stepped down or up. Note: static hysteresis loss is proportional to frequency. An equation called the Steinmetz equation can be employed to show that a 60 Hz supply dissipates more heat and energy than a 50 Hz system, which is why it is not preferred by many countries. Eddy-current losses, being proportional to the square of the frequency, are hence higher still for 60 Hz systems.

Now to understand the geographical areas of usage, consider this extract: "North American 110-120 volt electricity is generated at 60 Hz (cycles) alternating current. Most foreign 220-240 volt electricity is generated at 50 Hz (cycles) alternating current... tape and CD players, VCR/DVD players, etc. will not be affected by the difference in cycles. IMPORTANT: Voltage converters and heavy duty transformers do not convert cycles." From "What You Should Know About Traveling Overseas With Electrical Appliances", Copyright © 2001 Hybrinetics, Inc., http://www.voltagevalet.com/foreign.html

There are no clear advantages of 60 Hz over 50 Hz except in a few cases, like the one on video graphics I have considered below. Similarly, the 50 Hz supply used in India is not of much advantage either. The two systems are now standards, and a transformer is required to step up or step down between 220 V and 110 V. Sometimes the difference between the two frequencies is not significant. In general, a three-phase motor designed for 50 Hz can also run at 60 Hz if the voltage is increased by 15%, and a 60 Hz motor can also run at 50 Hz if the voltage is decreased by 15%, but without the ±10% tolerance expected of the 50 Hz version.
In a motor, eddy-current losses can be examined as follows: Pe ~ f**2, i.e. they will increase by 44% from 50 to 60 Hz. (Note: ~ indicates proportionality.)

For hysteresis losses (the C.P. Steinmetz equation/law): Ph ~ f**1.6, i.e. they will increase by 33.9% from 50 to 60 Hz. The exponent varies from 1.4 to 1.8, but it is generally accepted as 1.6. The losses described above produce heat that has to be removed. Accordingly, one might be forced to conclude that 60 Hz systems are more lossy. However, if you consider the output power and efficiency, 50 Hz systems are at a very slight disadvantage. More on this can be found in this extract: "Pm(50) is the mech power required at 50Hz. Pm(50)=T*wm(50) where T is the torque and wm(50) is the angular speed at 50Hz. At 60Hz, wm(60)= (60/50)*wm(50) = 1.2*wm(50) so the motor runs 1.2 times faster than at 50Hz.... If the efficiency is constant, then A(60)= (1.2/0.9) A(50) = 1.33 A(50); the apparent power increases by 33%..." Eng-Tips Forums: "50 and/or 60 hz motor running suitability." Initial post by dubairay; post by Alex68 on Sept 12, 2002.
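The two percentages quoted above fall straight out of those proportionalities, as this small Python sketch shows:

# Loss growth going from 50 Hz to 60 Hz, per the proportionalities above.
f_ratio = 60.0 / 50.0

eddy = f_ratio**2      # eddy-current losses: Pe ~ f^2
hyst = f_ratio**1.6    # hysteresis losses (Steinmetz): Ph ~ f^1.6

print(f"eddy-current losses grow by {(eddy - 1) * 100:.0f}%")   # 44%
print(f"hysteresis losses grow by {(hyst - 1) * 100:.1f}%")     # 33.9%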

An article with reference to effects of frequency on video display can be found here.

Additional Links:

If you are mathematically inclined, you can see how the results in the first part of the answer were reached here:

A good paper which considers the effect of frequency differences on transformers can be found here.

Another paper, on low frequency (30-80 Hz) transformers, can be found here.


Search Strategy: 50 Hz 60 Hz difference, 50 Hz 60 Hz advantage(s)
4.
Subject: Re: AC Power Supply: 50 Hz or 60 Hz?
From: neilzero-ga on 19 Nov 2002 17:07 PST

A very few people are adversely affected by fluorescent tubes powered at 60 Hz. The number may double at 50 Hz, but I am speculating. Almost everyone is adversely affected by a single fluorescent tube powered at 25 Hz or a lower frequency, as it is perceived as a strobe light. Two or more phase-shifted fluorescent tubes reduce this problem. The construction of the world's most powerful alternator would be more difficult and costly at 60 Hz than at 50 Hz, but the cost is essentially the same for alternators almost that powerful. On average, motors and transformers for 50 Hz are slightly heavier and slightly more efficient than the same rating for 60 Hz, and the motors typically run at 5/6 the speed of 60 Hz motors. This can be either a minor advantage or a minor disadvantage. The pain of electric shock increases as the frequency is lowered, as possibly does the probability of death. Hum is more difficult to reduce, but the sensitivity of the human ear to hum decreases with frequency, so that is about a tradeoff.
5.
Subject: Re: AC Power Supply: 50 Hz or 60 Hz?
From: roval-ga on 27 Nov 2002 10:52 PST

Do you want a clear answer: “Yes, this frequency is better than the other one!”? There is no such answer! You can go into the details (in different areas of electrical activity) and find advantages and disadvantages. shivreddy-ga explains how the power loss increases for 60 Hz systems compared with 50 Hz systems. There we can say that 50 Hz is better. But for fluorescent tubes it is preferable to have 60 Hz (it is harder to see the flickering). And, considering the explanation you received about the motor designed for 50 Hz and running at 60 Hz, I can tell you that is pure speculation. Nobody (who considers himself a professional) would ever do something like that. You CAN mix them up, but it is NOT RECOMMENDED! So which one is better? Overall, they are about the same.

But if neither has clear advantages over the other, why do we have two of them? I don’t know exactly, but it is very possible that there are “historical” reasons. It is very possible that there are political reasons: somebody decided to have a different value to prove he was different from somebody else (e.g. USA vs. England, 200 years ago), and being different can be interpreted as being independent. It is possible that there are “economic” reasons: somebody decided long ago to bar certain products coming from a country they didn’t like (to "protect" the market), and having a different standard was an easy way to justify such a "protective" measure. But all of this is politics. Let’s come back to technical matters.

You asked about frequency differences. What about voltages? Here there is a clear answer: 220-240 V is better than 110-120 V, for several reasons. The major one is power loss. When running at 120 V you have 4 TIMES more power loss in the distribution wiring compared with running the same equipment at 240 V. Plus you need thicker wires everywhere, better switches to connect/disconnect equipment, etc. As (almost) everybody knows: Power = Voltage x Current. For the same power, the lower the voltage, the higher the current; and the higher the current, the more problems you have. Conclusion: there is no clear difference between 50 Hz and 60 Hz from a technical point of view.
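A minimal Python sketch of that "4 TIMES" claim; the 1500 W load and the 0.1 ohm wiring resistance are assumed, illustrative values:

# Same load served at 240 V and at 120 V through the same wiring.
P_load = 1_500.0   # watts (illustrative appliance)
R_wire = 0.1       # ohms of wiring resistance (assumed)

for V in (240.0, 120.0):
    I = P_load / V
    print(f"{V:5.0f} V: I = {I:5.2f} A, wiring loss = {I**2 * R_wire:6.2f} W")

# Halving the voltage doubles the current, and since loss goes as I^2 * R,
# the 120 V loss is exactly (240/120)^2 = 4 times the 240 V loss.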
6.
Subject: Re: AC Power Supply: 50 Hz or 60 Hz?
From: lookfwd-ga on 14 Dec 2002 17:34 PST

There is something we should always keep in mind. There may be no important difference between 50 Hz and 60 Hz, but it is more than important for the supply to be standardized and stable. A lot of devices are optimised for, and in fact only work at, one of these frequencies. Among the most important are video equipment, which sometimes derives its clock frequency from the AC voltage, and some clocks that do the same. I live in Greece, where the voltage level is 240 V although nominally it should be 220 V, and the frequency is not very stable either. This causes a lot of trouble. Some PC power supply units burn out, some clocks don’t work, and of course a video player that came (with the appropriate TV set) from Canada couldn’t work even though we provided the 220 V to 110 V transformer. As you can see, it’s very important to hold the voltages and frequencies within the nominal limits.

* You may think that 240 V is within the typical range, but we shouldn’t forget the 5% tolerance. This means that where you have 240 V AC you could also get 20 seconds at 245 V. This is something that many producers forget, for their money’s sake.


