Here is a question for which I have been getting conflicting answers on the web: can someone provide a mathematical proof, in terms of actual calculations?
Most ceiling fans in India have a regulator that controls the fan's speed; there are two types, a resistance-based one and an electronic one. I have always believed that running a fan at a lower speed takes up more energy than you would expect, since a good amount of power is wasted as heat in the (resistance-based) regulator. But some argue otherwise.
Does someone have a mathematical proof either way, that it saves energy or that it does not? Breaking the answer down separately for the resistance-based and the electronic regulator is fine too...
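Not a proof, but here is the kind of back-of-the-envelope model I had in mind, sketched in Python. It treats the fan as a plain fixed resistor, which is a big simplification (a real fan is an induction motor whose impedance changes with load), and the numbers used for the mains voltage, the fan resistance and the regulator resistance are assumptions purely for illustration:

```python
# Rough sketch: compare a series-resistance regulator with an idealised
# electronic (phase-control) regulator. The fan is modelled as a fixed
# resistor -- a simplification, real fans are induction motors -- but it
# shows where the energy goes in each case.

V = 230.0        # mains voltage (V), typical for India
R_FAN = 400.0    # assumed fan resistance (ohm), illustrative only
R_REG = 400.0    # assumed series resistance added by the regulator (ohm)

# --- Resistance-type regulator: fan and regulator in series ---
i = V / (R_FAN + R_REG)          # same current flows through both
p_total = V * i                  # power drawn from the mains
p_fan = i**2 * R_FAN             # power actually reaching the fan
p_wasted = i**2 * R_REG          # power dissipated as heat in the regulator

print(f"Full speed (no series R): {V**2 / R_FAN:.0f} W drawn, all to the fan")
print(f"Low speed via resistor:   {p_total:.0f} W drawn, "
      f"{p_fan:.0f} W to fan, {p_wasted:.0f} W lost in regulator")

# --- Electronic regulator (idealised): delivers roughly the same reduced
# power to the fan while drawing only a little more than that from the mains
print(f"Low speed, electronic:    ~{p_fan:.0f} W drawn (losses negligible)")
```

With these made-up numbers the fan alone would draw about 132 W at full speed; with an equal series resistance the mains draw falls to about 66 W, but half of that (about 33 W) just heats the regulator. So even the resistive regulator draws less total power at low speed, it is just far less efficient about it than an electronic one. The real answer depends on the actual motor characteristics, which is exactly what I am hoping someone can pin down.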
TIA...
I did a small experiment a few days back: the power went out and the UPS kicked in. I switched off all the lights and fans and then switched on 4 fans at low speed. The 500W-capacity UPS showed 50% load when the 4th fan was switched on. Then I switched off 2 fans and the load dropped to 25%. I then gradually increased the speed of the remaining fans; when both were at their top speed, the load climbed back to 50%. At least at a first pass, this suggests that running a fan at lower speed does draw a lower load and use less power. However, I have the small socket-type (electronic) regulators at home, so I am not sure whether the same holds true for the resistance-type regulator. If someone has started researching this, please continue.
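For what it's worth, here is the arithmetic from that experiment written out, with the caveat that the UPS load readout is coarse and may be reporting apparent power (VA) rather than real watts:

```python
# Back-of-the-envelope numbers from the UPS experiment above.
# Caveat: the UPS readout is coarse (25% steps here) and may reflect
# apparent power (VA) rather than real power (W).

UPS_CAPACITY_W = 500.0

# (load fraction shown by the UPS, number of fans running)
readings = {
    "4 fans, low speed": (0.50, 4),
    "2 fans, low speed": (0.25, 2),
    "2 fans, top speed": (0.50, 2),
}

for label, (frac, n_fans) in readings.items():
    total = frac * UPS_CAPACITY_W
    per_fan = total / n_fans
    print(f"{label}: {total:.0f} W total, ~{per_fan:.0f} W per fan")

# => roughly 62 W per fan at low speed vs ~125 W at top speed with the
#    electronic regulators, i.e. about half the draw at low speed.
```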