Hello all,
The design competition is over, but I had promised to shed some light on a phenomenon I observed in my circuit. Although this is not really related to wireless electronics, it is related to power-limited sources.
Voltage and current limited boost converter
When designing the 6-cell charger I was looking for a way to boost the Qi voltage (5V) to a suitable charging voltage. I chose the TPS61165 LED driver, which is a boost converter, to do this. This driver operates as a current-limited boost converter. Now what happens when you remove the batteries (accidentally or on purpose)? The voltage goes sky-high! The driver IC tries to increase the output voltage to increase the output current, which obviously stays zero because there is no connection between the power output and the low-side sense resistor. In this driver IC the output voltage is limited to ~40V by an internal monitoring circuit. But... to switch the output of the current source to the batteries on and off I wanted to use small FETs with low Rds-on, and these were 20V-type FETs. Also, I wanted to prevent blowing up the bq2002T, which monitors the output voltage. Therefore, I limited the output voltage to approximately 9V with the help of a zener and a 10 Ohm resistor:
The idea behind this circuit is that the regulator tries to keep the FB pin at 200mV, which is normally (when the zener is not conducting) done by maintaining the current through R10: 200mV / 0.43 Ohm = 0.465A. PAD3 is the negative side of the battery, so a constant current of 0.465A is used for charging. When the output voltage rises above 9V the zener starts conducting and raises the voltage at the FB pin, thus limiting the output voltage.
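The feedback arithmetic is quick to sanity-check in a few lines of Python; the component values come from the text, and the variable names are just labels I picked for this sketch:

```python
# Sanity check of the current-regulation feedback loop.
V_FB = 0.200   # V, regulation target at the FB pin
R10 = 0.43     # Ohm, low-side current-sense resistor

# Normal operation: the zener is off, so the full FB voltage is
# developed across R10 by the charging current.
I_charge = V_FB / R10
print(f"Charging current setpoint: {I_charge:.3f} A")  # prints 0.465 A
```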
When I was testing, I found that the batteries were mostly charged with a lower current, 0.4A maximum. I thought about this and realized that the zener does not turn on at EXACTLY 9.1V; it starts conducting a bit earlier. Even below the rated zener voltage, a small leakage current through R11 (10 Ohm) and the sense resistor already contributes a few millivolts to the feedback voltage, which reduces the charging current. My next move was to use a larger-value zener diode, a 10V type. Now things started to get interesting; instead of charging, the Qi transmitter started blinking a red LED and refused to give power... What happened?
By increasing both the output voltage and the output current, I had increased the output power of the circuit. With the 9.1V zener I was running ~9V x 400mA at the output, which is 3.6W. With the 10V zener I was trying to get something like 9.5V x 0.46A = 4.37W. Should be no problem with 5W output power from the Qi, right? Nope. Wrong. With switching converters you have to deal with the efficiency, and with the conversion between input/output voltage and current. With an output voltage of 9.5V, the voltage is multiplied by 9.5/5 = 1.9. Because the output power has to be delivered by the input power, the input current has to be 1.9 times larger than the output current; in this case 0.46 x 1.9 = 0.87A. Unfortunately, the converter has its losses. At 85% efficiency the input current becomes 0.87 x (1/0.85) ≈ 1.03A. Aha! A continuous 1A is at the upper limit of what the bq51013A can deliver, and since I have some more chips running on the 5V rail, plus some LEDs and some pull-downs, the summed current is too large for the Qi receiver -> shut down.
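The power bookkeeping above can be condensed into one small function; this is a minimal sketch assuming an ideal power balance scaled by a fixed 85% efficiency (the function name is mine):

```python
# Estimate a boost converter's input current from its output voltage
# and current, assuming a fixed efficiency (numbers from the text).
def input_current(v_out, i_out, v_in=5.0, efficiency=0.85):
    return (v_out * i_out) / (v_in * efficiency)

print(input_current(9.0, 0.40))   # 9.1V zener case: ~0.85 A, OK
print(input_current(9.5, 0.46))   # 10V zener case: ~1.03 A, too much
```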
The good news
The good news here was that with the 9.1V zener in place, the circuit works! I realized that by making a voltage- and current-limited supply, I had in fact made a power-limited supply! To be honest, I hadn't designed it with that feature in mind. When charging the robot's 4 AA's, I had the full current output, because the charging voltage was approximately 6V. The zener didn't conduct at all, so all regulation was based on current -> verification of the current setpoint!
What is surprising about this is that the efficiency / charging time of the battery charging depends on two major factors: the slope of the V/I curve of the zener (at which voltage does it start conducting, and how many extra mA does it conduct for every mV the voltage rises?) and the internal resistance of the batteries! This latter factor is quite interesting to realize: as the internal resistance rises, the charging current causes a voltage rise of internal resistance x charging current above the open-circuit battery cell voltage. With 100mOhm internal resistance this is only 6 cells x 100mOhm x 0.5A = 0.3V, but with bad batteries of 0.5 Ohm internal resistance this becomes 6 x 0.5 Ohm x 0.5A = 1.5V. This means that old batteries reach the point where the output power is limited sooner, due to the higher charging voltage!
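That internal-resistance effect is easy to tabulate; the resistances and charge current below are from the text, while the 1.45V per-cell voltage under charge is my own assumption for illustration:

```python
# Charging voltage of 6 series cells as a function of internal
# resistance (0.5 A charge current; 1.45 V per cell is assumed).
def charge_voltage(r_internal, v_cell=1.45, i_charge=0.5, n_cells=6):
    return n_cells * (v_cell + r_internal * i_charge)

print(charge_voltage(0.1))   # fresh cells, 100 mOhm each
print(charge_voltage(0.5))   # worn cells: 1.2 V higher, hits the clamp sooner
```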
Verification
I found this combination of factors so intriguing that I set up a small simulation circuit; if you use SwitcherCAD (LTspice) you can try it for yourself; otherwise please take a look at the schematic, and you'll be able to redraw it in any simulator of your liking. In this case I used the model for the 9.1V zener, and added a voltage source under it to show the difference between the 9.1V and 10.1V output voltage limits.
The simulation (click here for a full-size screenshot) shows the output current (top graph) and output power (bottom graph) for both situations, with the cell voltage plotted on the horizontal axis. R3, which is used for current measurement, is a 1uOhm resistor in series with voltage source V2, now hidden by the dialog showing voltages and currents. You can see that when taking efficiency into account, the output power rises beyond the 5W limit with the 10.1V output limit. The 100mOhm chosen is very optimistic; in most cases it will be at least 0.6 Ohm, which means the output current will be limited even sooner.
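If you don't have LTspice handy, the general shape of those curves can be approximated in a few lines; the linear roll-off over the zener knee below is my own crude stand-in for the real diode model, with the setpoint and clamp values taken from the text:

```python
# Current-limited boost with a soft voltage clamp: the output current
# rolls off linearly over the zener's knee (a crude approximation of
# the real diode curve).
def output_current(v_batt, i_set=0.465, v_clamp=9.1, knee=0.5):
    if v_batt <= v_clamp - knee:
        return i_set                      # pure current regulation
    if v_batt >= v_clamp:
        return 0.0                        # fully clamped
    return i_set * (v_clamp - v_batt) / knee

for v in (6.0, 8.5, 8.8, 9.0):
    i = output_current(v)
    print(f"V = {v:4.1f} V   I = {i:.3f} A   P = {v * i:.2f} W")
```

Sweeping `v_batt` upward shows the same story as the screenshot: constant current at low battery voltage, then a falling current (and capped power) as the zener takes over.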
For those interested I attached this simulation file to this post (click on header).
Conclusion
It doesn't happen often, but sometimes an unintended feature DOES work out in a good way!