Hi fellow EEs,
I have an old buck converter design that uses an off-the-shelf current-mode controller IC. A sense resistor is used to measure the inductor current for the control loop. The original design is rated for a current much higher than necessary, and as a result the efficiency is only 50-60% with the small load typical of this application.
Since the sense resistor was chosen for a current about 14 times greater than the nominal current in this application, I'm considering using a larger sense resistor to bring the controller's current range closer to what the load actually draws, per the datasheet formula that relates the Rsense value to IOUTmax. Will this ultimately increase my efficiency, and why?
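For reference, here is a rough sketch of the kind of sizing relationship I mean. The 1.0 V comparator threshold and the currents below are illustrative assumptions for a generic current-mode controller, not values from my actual part's datasheet:

```python
# Illustrative sketch: many current-mode controllers set the peak current
# limit with a fixed current-sense comparator threshold, so
#   Rsense ~= Vsense_max / IOUTmax.
# V_SENSE_MAX and the currents below are assumed values, not from my design.

V_SENSE_MAX = 1.0  # V, assumed comparator threshold (check your datasheet)

def rsense_for(i_out_max):
    """Sense resistance that makes the current limit trip near i_out_max."""
    return V_SENSE_MAX / i_out_max

r_orig = rsense_for(14.0)  # original design, rated ~14x the real load current
r_new = rsense_for(1.0)    # resized for the ~1 A the load actually draws

print(r_orig, r_new)       # resized Rsense comes out ~14x larger
```

So resizing Rsense for the real load scales the resistor up by roughly the same factor the design was over-rated by.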
I ran an experiment in which I substituted a larger sense resistor value, and the efficiency improved by about 10-20% with no other changes. I'd really like to understand the mechanism behind this.
Thanks,
New Power Supply Designer