RoadTest: MAX17260GEVKIT# Fuel Gauge EVM
Evaluation Type: Development Boards & Tools
Did you receive all parts the manufacturer stated would be included in the package?: True
What other parts do you consider comparable to this product?: TI offers Fuel Gauge chips too.
What were the biggest problems encountered?: No issue. However, it would be nice to see documentation on how to use different thermistors, other than the ones in the user documentation.
In “Do the Right Thing” one of the characters loves his music. But he desperately needs twenty “D” Energizers when his boombox grinds to a halt part-way through his favorite song. Mega frustration!
The MAX17260GEVKIT being reviewed here is a kit used to evaluate the Maxim Fuel Gauge chip, and as the name suggests, it has something to do with measuring the amount of charge in a battery. The MAX17260-based kit is for single-cell Lithium designs; however, Maxim also manufactures a MAX17261 chip that is near-identical and is designed for multi-cell batteries. Regarding part codes, Maxim product numbering has gotten crazy over the years. I remember when Maxim product codes had just three digits : ) The MAX17260GEVKIT is a mouthful, so it will be abbreviated to “Fuel Gauge Kit” throughout the review.
The amount of charge in a rechargeable battery connected to a circuit can be hard to predict. The information is important because it tells you how long before your equipment needs a recharge. A good system may also allow for estimation of other information that would be useful as well, such as the health of the battery.
While a boombox may not always be critical, knowledge of the amount of charge is clearly highly important for other modern uses of batteries; in electric vehicles for instance, where it is critical to successfully arrive at a location where the battery can be recharged, or even swapped out as in Formula E racing (shades of the Pony Express!). It’s quite important for high-end equipment like laptops, video cameras, power tools, and of course mobile phones.
The interesting algorithm that is used inside the Fuel Gauge chip is claimed by Maxim to produce extremely accurate results concerning charge state (to within a few percent!) so I was excited to explore this.
It turns out that the algorithm is extremely accurate, granular (no more three-bars battery indication; you can display the charge in percentages) and surprisingly easy to use. Often new technology can be complicated to use, but Maxim have made it simple for anyone to design into their products, with almost zero knowledge required about battery chemistry and parameters.
A simple approach would be to just measure the battery voltage and map it directly to a perceived state of charge: a lower voltage would mean less charge. Most low-cost consumer electronics equipment will implement such a method. The voltage is compared against levels (using either comparators, or an analog-to-digital converter in a microcontroller chip) and used to show bars in a battery icon on the display. This may seem a good method, but in practice it’s not, for two reasons. First, nearly all rechargeable battery technologies have a nearly zero voltage gradient over much of the discharge curve (see the orange dashed line in the diagram below), so measuring the voltage and directly mapping the measurement does not accurately tell you where the battery is on that curve. Second, because of battery internal resistance, the implementation can have messy effects when the load changes temporarily (such as the boombox; as soon as there is any heavy bass, the low battery charge state indicator may flicker). The simple mapping system has no idea whether it is measuring the open-circuit battery voltage, or the voltage of a battery under load.
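To make the flaw concrete, here is a minimal sketch of the naive voltage-to-bars mapping. The threshold values are made up for illustration (not from any Maxim documentation), and the internal resistance figure is taken from the battery spec later in this review:

```python
# Naive fuel gauge: map the measured terminal voltage straight to a bar count.
# Threshold voltages below are illustrative values for a single Li-Ion cell.

def voltage_to_bars(v_batt: float) -> int:
    """Return 0-3 battery bars from a single measured cell voltage."""
    if v_batt >= 3.9:
        return 3
    if v_batt >= 3.7:
        return 2
    if v_batt >= 3.5:
        return 1
    return 0

# The flaw: a transient load (heavy bass on the boombox) sags the terminal
# voltage by I * R_internal, so the same battery "loses" a bar under load.
R_INTERNAL = 0.150   # ohms, per the battery spec quoted later in this review
v_open_circuit = 3.75
i_load = 1.0         # amps

bars_idle = voltage_to_bars(v_open_circuit)
bars_loaded = voltage_to_bars(v_open_circuit - i_load * R_INTERNAL)
print(bars_idle, bars_loaded)   # the loaded reading drops a bar
```

The same cell reports fewer bars the moment a heavy load is applied, which is exactly the flickering-indicator effect described above.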
Instead, more capable fuel gauge systems attack the problem in different ways. Toward the end of the 20th century, the established method was to measure the amount of charge entering or leaving the battery, usually by measuring the current with a sense resistor. The direction of the current flow is used to determine whether the battery is being discharged (i.e. it is being used), or charged. The amount of charge depends on how long the current flowed for; in other words, the measured current needs to be integrated over time. In practice this can still have inaccuracies, because if there’s any quantization error, or the current changes too rapidly (i.e. the sample rate is too low), then over time the calculated amount of charge will drift from the actual amount of charge. The drift can only easily be corrected at known charge levels (e.g. when the battery is fully charged or discharged). If the battery is never left to fully charge or discharge, then the error just accumulates.
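The charge counting and its drift problem can be sketched in a few lines (illustrative numbers only): a small constant offset error in the measured current is integrated right along with the real current, so the estimate diverges until it is re-anchored at a known full or empty point.

```python
# Minimal coulomb counter: integrate sampled current over time.
# A constant ADC offset error is integrated too, so the state-of-charge
# estimate drifts until re-anchored at a known point (full or empty).

def coulomb_count(samples_ma, dt_s, capacity_mah, soc_start=100.0):
    """Integrate discharge current samples (mA, fixed interval) into SOC %."""
    mah_used = sum(i * dt_s for i in samples_ma) / 3600.0
    return soc_start - 100.0 * mah_used / capacity_mah

CAPACITY_MAH = 1000.0
true_current = [500.0] * 3600    # 500 mA drain for one hour, 1 s samples
seen_current = [505.0] * 3600    # the same hour seen through a +5 mA offset

true_soc = coulomb_count(true_current, 1.0, CAPACITY_MAH)
est_soc = coulomb_count(seen_current, 1.0, CAPACITY_MAH)
print(round(true_soc, 1), round(est_soc, 1))   # the 0.5% gap grows every hour
```

After one hour the estimate is only 0.5% off, but with no full charge/discharge event to correct against, that error keeps accumulating.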
Fuel gauge systems can also do other things for better battery modelling. For instance, they can gather information during known full charge and discharge cycles and store it in internal EEPROM, and they can monitor the number of charge cycles, battery temperature, and other parameters, to more accurately determine the amount of charge remaining as the battery ages over the years, and sometimes even as the battery self-discharges (systems that do not account for self-discharge will have an error of, ballpark, 10%).
Maxim took an interesting approach to measuring the amount of charge. The first version of their algorithm dispensed with the sense resistor and exclusively measured battery voltage, like the simplistic method mentioned earlier. However, the measured value was not directly mapped into a charge value. Instead, the ModelGauge algorithm observed any sharp fluctuations in the measured voltage and assumed these were due to the battery being loaded, and tried to estimate what the actual no-load battery voltage would be. It corrected itself whenever the rapid fluctuations decreased and the battery voltage was at its highest within a certain window of time, which would indicate that the battery was no longer being loaded. By doing this, the algorithm maintained an internal model of the no-load voltage of the battery at all times, regardless of whether the battery was loaded or not. This algorithm did still exhibit error, but had the advantage that it was simpler to implement since it did not need any current sense resistor, and as a result it was also slightly more energy-efficient. With circuitry that switches between load and no-load (or very close to no-load), this algorithm would work reasonably well.
The ModelGauge algorithm was improved over the years. The current ModelGauge m5 algorithm (as used in the supplied Fuel Gauge Kit) combines the original ModelGauge with the charge (Coulomb) counting method. It means that it has the benefits of charge counting, but without so much integration error over time, since it can regularly correct itself using the algorithm’s internal estimation of the no-load voltage. ModelGauge m5 stores many parameters, and mashes them together to build a more accurate model of the battery, including battery ageing, to make more accurate estimations of the state of charge, but also to make an estimation of the age of the battery too.
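As a toy illustration only (this is my own sketch of the general idea, not Maxim’s actual ModelGauge m5 maths), combining the two methods can be pictured as a coulomb-counted value being periodically nudged toward a voltage-derived one, which bounds the drift:

```python
# Toy sketch of combining coulomb counting with a no-load voltage model:
# the counter provides the short-term answer, and the voltage-derived SOC
# periodically pulls the estimate back, limiting long-term drift.
# My own illustration; NOT Maxim's actual ModelGauge m5 algorithm.

def blended_soc(counted_soc, voltage_soc, weight=0.05):
    """Nudge the coulomb-counted SOC toward the voltage-model SOC."""
    return (1.0 - weight) * counted_soc + weight * voltage_soc

soc = 60.0             # coulomb counter, drifted a few percent high
voltage_model = 55.0   # SOC inferred from the estimated open-circuit voltage
for _ in range(50):    # repeated small corrections converge on the model
    soc = blended_soc(soc, voltage_model)
print(round(soc, 2))   # close to the voltage model's 55%
```

Each correction removes a fraction of the accumulated error, so unlike a pure coulomb counter, the estimate cannot wander indefinitely.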
Fuel gauge systems can be used in several ways. One way is that they can be embedded in the product, if the product contains an internal rechargeable battery.
Alternatively, if the battery is removable, then the fuel gauge circuit can be embedded inside the battery pack. That way the battery pack is responsible for retaining the charge information, even when the battery pack is disconnected from the product and perhaps charged separately. For the product to be able to read the charge information, extra contacts are often used on the battery pack.
Yet another way to make use of fuel gauge systems is to integrate one into the battery pack along with a microcontroller, a push-button, and some LEDs. That way, the battery pack can be examined at any time by the user, without needing to plug it into the end equipment. This is popular with power tool batteries (such as electric drills).
By the nature of where fuel gauge circuitry is located, it can also make sense to integrate additional functionality, such as battery authentication capability, to prevent operation with a battery from an unknown manufacturer.
There are two boards in the kit, plus a two-meter cable to connect them together.
One board contains the actual MAX17260 chip (PDF datasheet) and a current sense resistor, and not much else. The RJ11-style connector (i.e. telephone-style, but with 6 pins wired) exposes an I2C serial interface from the chip. The other board is like a USB-to-I2C gateway. It allows you to connect to a PC, and send or receive I2C data.
The diagram below shows how the bits are connected up. The Fuel Gauge chip can use a high-side, or a low-side sense resistor, and there’s a jumper on the board to select which one you want to use.
The Fuel Gauge chip has an in-built temperature sensor, which is used to more accurately estimate charge based on battery temperature (if the chip is placed in close thermal contact with the battery), otherwise an external thermistor (10k or 100k) can be attached.
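For reference, converting an NTC thermistor resistance to temperature is commonly done with the beta-parameter model. The 10k / beta = 3380 values below are typical for small NTCs of this kind, but are my assumption; check the actual part’s datasheet:

```python
# Beta-parameter model for an NTC thermistor: convert measured resistance
# (ohms) into temperature (Celsius). R0/beta values are typical assumptions
# for a small 10k NTC; substitute the real coefficients from the datasheet.
import math

def ntc_temp_c(r_measured, r0=10_000.0, beta=3380.0, t0_c=25.0):
    """Beta-model NTC conversion: resistance in ohms to degrees Celsius."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_measured / r0) / beta
    return 1.0 / inv_t - 273.15

print(round(ntc_temp_c(10_000.0), 1))   # at R0 the model returns 25.0 C
```

This is also why the Fuel Gauge chip needs coefficients configured for the specific thermistor fitted: the resistance-to-temperature curve differs between parts.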
The chip comes in a small 3x3mm package (0.4mm pitch TDFN), but it is just about manageable with hand tools for prototypes. There’s an even tinier 1.5x1.5mm package option if required.
The supplied USB-I2C adaptor is actually quite nice, but there’s not a lot of information on it. Lots of googling eventually revealed that it looks like a USB HID device to the connected PC, and instructions can be sent to it to perform low-level I2C activity such as an I2C ‘start’ or ‘stop’ condition. Curiosity got the better of me and I tried controlling it via Linux, and was successful in sending a few commands and with an oscilloscope I could see the I2C lines doing their thing. With some effort it could be turned into a general-purpose I2C test tool if desired.
However, the purpose of the kit is to evaluate the Fuel Gauge chip, and for that Maxim supplies a PC application that will control the USB-I2C adaptor.
In order to try out the Fuel Gauge Kit, I of course needed a battery, a charger, and some way of discharging the battery. It was decided to assemble the test bed topology shown here.
I built up the test-bed on a block of wood, so that nothing would accidentally short. I didn’t want to experience exploding batteries.
The battery had the following specification:
| Parameter | Specification |
| --- | --- |
| Type | Li-Ion Polymer, Single Cell |
| Capacity | 1000 mAH typical |
| Impedance | < 150 milliohm |
| Charging method | Constant Current (CC) to 4.2V, followed by 4.2V Constant Voltage (CV), terminating when the current drops to 10mA |
| Built-in cell protection | Cut-off at 3.0V and 4.2V |
As a first step, I had to build a charger. I decided to charge the battery at 333mA constant current. I used a MCP73831T-2AC charger chip, which can be programmed for 333mA by using a 3kohm resistor connected to one of its pins. I followed the datasheet circuit, I didn’t do anything special.
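The PROG resistor arithmetic is simple; per the MCP73831 datasheet relationship I_REG = 1000V / R_PROG:

```python
# MCP73831 fast-charge current programming: the PROG pin resistor sets the
# regulation current via I_REG = 1000 V / R_PROG (datasheet relationship).

def mcp73831_charge_current_ma(r_prog_ohms: float) -> float:
    """Fast-charge regulation current in mA for a given PROG resistor."""
    return 1000.0 / r_prog_ohms * 1000.0

print(round(mcp73831_charge_current_ma(3000.0)))   # 3 kohm -> ~333 mA
```

So the 3kohm resistor gives the 333mA constant-current charge used here.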
The Fuel Gauge accuracy depends in part on how accurately the battery temperature is known. This gets a little tricky, because the Fuel Gauge chip has an internal sensor, but it is not positioned close to the battery since it is on a dev-board. In a real design, the chip could be on a PCB pressed close to the battery. The Fuel Gauge board has an external thermistor too, already soldered on the board. I decided to desolder it, attach it to wires, and then place it on the battery. The thermistor is a tiny 0402 sized part, so a lot of care was needed not to lose it : )
Incidentally, in a real design, if a thermistor is used, then it’s highly recommended to use the same one as on the dev-board. The reason is, some coefficients need to be configured via I2C to the Fuel Gauge chip, and the information on how to calculate the coefficients was not available anywhere – Maxim may be able to provide this information, or explain how to experimentally derive the coefficients. Anyway, the thermistor on the dev-board is a very popular one (and low cost), I had no issue ordering a few from Farnell/Newark for future experiments. A couple of alternatives are also listed in the Maxim documentation.
I used thermally conductive ‘Gap-Pad’ to cushion the thermistor, to try to provide better contact with the battery.
Upon first power-up and running the PC software, the user is prompted to enter some battery parameters. Not a lot of information is required to get the Fuel Gauge going!
These were the only settings that I needed to enter. The Fuel Gauge chip needs to know the attached battery charger’s Charge Termination Current, because it uses that to detect whether the user has aborted charging early, or the battery is fully charged. The particular charger chip that I used happens to terminate the charging process (i.e. switches off the CV mode and internally disconnects from the battery) at a maximum of 9.4% of the normal charging current (333mA). So, that’s 31mA. The value that the Fuel Gauge chip needs (NOTE: this is what I initially assumed, but see later, in the section "Charging the Battery") is 1/1.25 (i.e. 0.8) times that, because the chip detects charge termination within a certain band (between 0.125 and 1.25 times the configured value). So, I entered 0.8 x 31mA, i.e. 24.8mA.
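The arithmetic above, worked through in a few lines (the 9.4% figure comes from the charger chip’s datasheet; as the review later finds, the 1/1.25 scaling was a misunderstanding and the charger’s own termination value should be used directly):

```python
# Working through the charge-termination numbers. The Fuel Gauge detects
# end-of-charge in a band of 0.125x to 1.25x the configured termination
# current, so my first attempt scaled the charger's cutoff by 1/1.25 (0.8).

CHARGE_CURRENT_MA = 333.0
TERMINATION_FRACTION = 0.094   # charger cuts off at up to 9.4% of I_chg

charger_cutoff_ma = round(CHARGE_CURRENT_MA * TERMINATION_FRACTION)  # ~31 mA
first_attempt_ma = charger_cutoff_ma * 0.8   # 24.8 mA, later revised to 31 mA

print(charger_cutoff_ma, first_attempt_ma)
```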
Next, the software warns that the Fuel Gauge chip has experienced a ‘Power On Reset (POR)’. That’s because the battery is being connected to it for the first time – before that, the chip was unpowered of course. I clicked ‘Yes’ to ensure the settings were uploaded to the chip via I2C, since the chip does not retain any settings when it is fully unpowered with the battery disconnected.
The displayed information is impressive. All of it is useful and worth exploring, but some of the immediately interesting ones are highlighted in red. At the top, there are multiple tabs and you can get graphs updated in real time, or view the I2C register settings (again these update in real-time if they are results registers). There’s close to 100 different registers, so there’s very rich information.
The Time to Full (highlighted in red) reports as N/A, meaning that the battery is not being charged yet. On the right, you can see the current and average battery voltage, and the temperature (using the thermistor if you click on the Configure tab and select it). The current is reported too, based on the voltage across the sense resistor. It’s a 16-bit register, and the value in the screenshot indicates that it is likely there is no load attached, or a very small load, since the average current reading is a fraction of a milliamp.
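For anyone decoding the Current register in their own code, the conversion is roughly as follows, assuming the kit’s 10 milliohm sense resistor (verify the value fitted to your board) and the 1.5625 uV/Rsense LSB from the datasheet:

```python
# Decoding the Current register: a signed 16-bit value with an LSB of
# 1.5625 uV across the sense resistor. With an assumed 10 mohm sense
# resistor, one LSB is 156.25 uA; negative values indicate discharge.

def current_register_to_ma(raw: int, r_sense_ohms: float = 0.010) -> float:
    """Convert the raw 16-bit two's-complement Current register to mA."""
    if raw >= 0x8000:            # sign-extend two's complement
        raw -= 0x10000
    return raw * 1.5625e-6 / r_sense_ohms * 1000.0

print(current_register_to_ma(0x0001))   # one LSB, a fraction of a milliamp
print(current_register_to_ma(0xFFFF))   # minus one LSB, a tiny discharge
```

A reading of a few LSBs either way, as in the screenshot, corresponds to well under a milliamp, consistent with no load being attached.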
The blue highlighted items above are very cool. The alerts section is used to program the Fuel Gauge chip via I2C to detect different conditions (you can set thresholds for them too), and drive an open drain ALRT pin to ground if any of the conditions occur. You can then read the various registers, and decide what to do (e.g. flash an alert to the screen on the end product, or auto-power-off until the battery is back in a safe temperature zone for instance).
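As a sketch of what the end product would do, here is how the voltage alert thresholds might be packed into the VAlrtTh register. The 0x01 address, the byte order (max in the upper byte, min in the lower) and the 20mV LSB are from my reading of the MAX1726x-family documentation, so verify them against the MAX17260 datasheet before relying on this:

```python
# Encoding the VAlrtTh (voltage alert threshold) register: upper byte is
# the maximum threshold, lower byte the minimum, both with a 20 mV LSB.
# Address and scaling are my reading of the docs; verify before use.

VALRT_TH_ADDR = 0x01   # register address, per the MAX1726x documentation

def encode_valrt_th(v_min: float, v_max: float) -> int:
    """Pack min/max voltage alert thresholds into one 16-bit register value."""
    lo = round(v_min / 0.020)   # minimum threshold, lower byte
    hi = round(v_max / 0.020)   # maximum threshold, upper byte
    return (hi << 8) | lo

value = encode_valrt_th(3.00, 4.20)
print(hex(value))   # 16-bit value to write over I2C
```

If the battery voltage leaves this window, the chip drives the open-drain ALRT pin low, and the host can read the Status register to find out why.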
Also, on the left side, is something extremely interesting. The AtRate register can be set to a hypothetical value that represents current, and then the AtTTE register can be read. It will report back how long the battery will last for, at the hypothetical AtRate value : ) This could be extremely useful for battery-powered computing systems. An example scenario could be the need to check if there is enough power to save data prior to shutting down. Incidentally for Linux systems, Maxim supplies the relevant drivers, so that the operating system can make use of the information from the Fuel Gauge.
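The unit conversions for a hypothetical AtRate query might look like this; the scalings (a current LSB of 1.5625 uV/Rsense with discharge negative, and a time LSB of 5.625 seconds) are from my reading of the MAX1726x user guide and should be verified against the MAX17260 datasheet:

```python
# Unit conversions for the AtRate/AtTTE pair. AtRate uses the same current
# LSB as the Current register (1.5625 uV / Rsense, discharge negative),
# and the time-to-empty registers count in 5.625 s units. Scalings are my
# reading of the MAX1726x user guide; verify against the datasheet.

R_SENSE_OHMS = 0.010   # assuming the kit's 10 mohm sense resistor

def ma_to_atrate(ma: float) -> int:
    """Encode a hypothetical discharge current (mA, positive) as AtRate."""
    lsb_ma = 1.5625e-6 / R_SENSE_OHMS * 1000.0   # 0.15625 mA per LSB
    return (-round(ma / lsb_ma)) & 0xFFFF        # negative means discharge

def atte_to_hours(raw: int) -> float:
    """Decode an AtTTE reading (5.625 s per LSB) into hours."""
    return raw * 5.625 / 3600.0

print(hex(ma_to_atrate(500.0)))    # raw value to write for a 500 mA query
print(atte_to_hours(640))          # 640 LSBs is exactly one hour
```

The host writes the hypothetical rate, waits briefly, then reads back the projected time-to-empty for that load.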
An extremely useful part of the software is the Information tab. There, you can see a timestamped list of all the I2C registers that are written to. So, if you make a change in the settings, you’ll always be able to see a history of the precise registers that were modified, so that you can then implement it in your code for the end product.
The software can also chart information in real-time (see the next section) and allow saving of data to a comma-separated-variable (CSV) file.
In summary, I found the PC software to be truly excellent. It is feature-packed, and it’s well laid out. The dial gauges and the graphs are nicely implemented. The engineers had fun developing this software : ) I loved that the software makes it so convenient to explore the registers, but also to be able to see precisely what I2C commands the end product will need to issue too. The PC software is a massive time-saver for developers to deploy the Fuel Gauge chip in their products.
I’d not used this battery since I’d purchased it five years ago. I’d bought it for a project but then used a different battery. I had no idea what state the battery was in. Thankfully, it seemed fine – the initial open circuit voltage was reported by the Fuel Gauge chip to be 3.83V, which is comfortably within the expected value range for such a Li-Ion Polymer cell.
I plugged in the 5V DC adapter, and watched what the Fuel Gauge reported as the battery was charged for the first time.
After an hour, the charger auto-switched from CC mode to CV mode, and then it took almost two more hours before the charger switched itself off at about 28mA.
I was happy with the operation of the charger circuit; it seemed to be functioning fine. The Fuel Gauge PC software automatically plots in real-time some of the dynamically-updating register contents, which was a nice sanity check to confirm the charger worked. This was the first time the battery had been connected to the Fuel Gauge, and no complete charge cycle had been done, since the battery already had some charge in it. In the screenshot below, some of the registers related to Capacity show some self-corrections going on part-way. These are interim measurement registers; they are not normally end-use registers.
I was curious whether the Fuel Gauge chip had detected that the charger had finished its job, since that is important for the highest accuracy. Looking through the documentation, I found a register called FStat, which contains state bits (for the state machine inside the Fuel Gauge chip). Looking at the register contents around the time that the charger completed, I could not see that the relevant bit had toggled : ( This meant I had not left enough margin for the algorithm to detect end of charge. So, I went into the configuration and set the Charge Termination Current value slightly higher, at 31mA instead of the previous 24.8mA. It seems I’d misunderstood, and it is recommended to set the value to the value the charger is configured for, rather than attempting to multiply by 1/1.25.
I was concerned that changing the register value on-the-fly could invalidate the learned information now stored in the chip unless I disconnected it from the battery to perform a power-on-reset. However, I decided to continue without a reset, to see if it caused any significant issue (it didn't).
After this first charge, the Fuel Gauge reported close to 100% charge, and a capacity of 1046mAH. This could have an error, since the Fuel Gauge had not seen a full charge and discharge cycle yet. To check the error, the battery needs to be discharged in a controlled manner, to see how much charge can be extracted from it.
The BK8600 DC Electronic Load has a battery test capability. With just a few button-clicks, it can act as a constant-current load, and it will report the capacity as a mAH value dynamically. It will automatically stop and disconnect itself once the battery is discharged (thresholds can be set for this). So, I set it up to automatically discharge at a constant current of 1000mA, and to terminate at 3.05V (the battery protection circuit will disconnect at 3.0V, so I just picked a voltage close to this).
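In miniature, the mAH figure the BK8600 reports is just the discharge current integrated over time; a trapezoidal sum over sampled current does the same job even when the current is not perfectly constant:

```python
# What a battery-test capacity readout does internally, in miniature:
# integrate the discharge current over time to obtain mAh. Trapezoidal
# integration over fixed-interval samples handles a non-constant current.

def capacity_mah(samples_ma, dt_s):
    """Trapezoidal integral of current samples (mA, fixed dt in s) -> mAh."""
    if len(samples_ma) < 2:
        return 0.0
    trap = sum((a + b) / 2.0 for a, b in zip(samples_ma, samples_ma[1:]))
    return trap * dt_s / 3600.0

# One hour of a nominal 1000 mA constant-current discharge, 1 s samples:
samples = [1000.0] * 3601
print(round(capacity_mah(samples, 1.0)))   # ~1000 mAh
```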
As expected, the battery discharged for about an hour until it was depleted.
At the end of the discharge, the battery voltage slowly recovered, and it could be seen from the PC software that the Fuel Gauge reported 4.746% capacity remained : ) This sounded about right, since I’d only discharged to 3.05V, rather than the 3.0V that I’d configured in the Fuel Gauge chip for the empty voltage setting. At a constant drain of 1000mA, that would only allow the battery to last for another 3 minutes, to put it into perspective.
In summary, it seemed that the Fuel Gauge chip was behaving correctly, and the results seemed pretty good so far. However, this first test revealed nothing about how accurate the fuel gauge was during operation; it merely showed that the chip reported a reasonable 4.746% remaining capacity value after discharge.
So, now that the fuel gauge chip had learned a little about the battery, I decided to fully charge it, and then observe the reported remaining capacity at fixed intervals during the subsequent discharge process.
The battery was fully charged (from the 4.746% capacity remaining after the previous test) to 100%. This time, with the modified register setting mentioned earlier, the Fuel Gauge chip correctly detected when the end-of-charge condition occurred. The screenshot below shows in blue when that occurred (I know this from looking at the register values in the logged CSV file; the FStat register in the CSV file contains this information). Around 15 minutes after that detection, the charger chip powered itself off.
Now that the battery was charged, the Fuel Gauge chip reported that the capacity was 1053 mAH. I set the BK8600 to perform a test at 1053mA constant current, for a period of 12 minutes. This should result in about 20% of the capacity being used up, i.e. 211 mAH. I repeated this another four times, until the battery was fully depleted.
The two-minute video here shows the end of the test, as the battery voltage was approaching 3.0V, at which point the BK8600 was programmed to terminate the constant current discharge test. Apologies for poor camera-work.
And here are the charts from the PC software!
In theory the result should be that the Fuel Gauge chip reports remaining capacity levels of 80%, 60%, 40%, 20% and 0%, and the results were pretty close! Within 3% for all these five points in time that it was measured. The final remaining capacity was 2.7% and not 0%, and this could be due to the battery discharge test terminating slightly earlier due to voltage drop across the load wires (I did not use separate sense wires to the electronic DC load for this test). That 2.7% would have resulted in the battery lasting another minute and a half at the 1053mA discharge rate.
In conclusion the results were impressive. The reported capacity from the fuel gauge chip was extremely accurate. Products using this chip could benefit from a more sophisticated display than just a few LEDs showing basic charge status, due to the accuracy and granularity of the reports.
More tests could be done, in particular those where the battery is only partially charged and discharged several times, to observe that no drift occurs in the measurements, and also tests after the battery has been charged and discharged dozens of times, to see that the cell ageing is learned by the Fuel Gauge chip too. Due to time constraints, it was not feasible to do this for the RoadTest (each test takes half a day with the current charge and discharge rates). For developers interested in creating combined battery and fuel gauge solutions, it would be worthwhile running tests simultaneously on (say) half a dozen batteries and Fuel Gauge chips, to speed up the entire test period.
The Fuel Gauge evaluation kit is invaluable to be able to experience the capabilities of Maxim’s ModelGauge m5 algorithm. It’s a well-designed kit and software combination.
In the past the ability to accurately measure or estimate the charge inside a battery was complicated and inaccurate. Many systems did not take into account battery self-discharge, or relied on complete charge and discharge cycles in order to reset drift in reported values. The ModelGauge m5 algorithm would appear to solve all of these issues through the combination of coulomb-counting using a sense resistor, plus open circuit voltage estimation. An exhaustive test of the algorithm is not possible without a lot more testing, but it was impressive that even after the first incomplete charge and discharge of an old battery, the learned data was already good enough for the subsequent charge and discharge cycle to have accurately reported remaining capacity information, to well within 3%. I also liked that it was possible to optionally use a thermistor to make the algorithm perform well. There are also other capabilities beyond the core capacity reporting, such as the ability to predict how long the battery will last for a given load, although this was not explored.
The usability of the algorithm is excellent. Very few inputs need to be configured over I2C, and the Fuel Gauge chip will then take over and deliver detailed information. I’ve been impressed enough by the technology to want to develop future solutions with it. In fact, using the Fuel Gauge chip is likely to be simpler than implementing a low-power indicator any other way. Very few components are required – just a couple of decoupling capacitors, and the sense resistor. The thermistor is optional; the chip has an internal sensor too.
It was super-interesting evaluating this kit, and I hope the information here has been useful to explain the technology, and how to go about exploring it and using it in custom product designs. Thanks for reading!
Great writeup Shabaz. I really like the background you give on the technology and descriptions of how it works. Nice piece of kit too.
Very nicely conducted review - well done.
I can imagine using this part in the next iteration of one of my customers' plans.
Great roadtest on this interesting part. Battery charge level is an interesting problem, which as you reported, has been difficult to get right. I have worked on consumer products with Li-Ion batteries and getting reasonable 'charge remaining' values have been quite difficult. It looks like there is a lot to learn from this part and your roadtest, so I will bookmark this latter reference/review.
A very well written and informative roadtest. Did you note what different battery models were available when setting up the battery initially?
It was in the back of my mind to list those, but forgot. Here's the options (and there's a message that Maxim can be contacted for types not listed):
These convert to a register programmed value of 0, 2 or 6, so I'm guessing there could be unlisted types for specific unusual cells perhaps.
Thanks for the nice comments! I was learning a lot during the review. The last time I tried a fuel gauge solution, the state-of-the-art was quite poor, and I'd not explored it since then.
At that time, the solutions could not take into account self-discharge when the battery pack was disconnected from the equipment. The error was enough that we wanted to resolve it, and we considered fitting a thermistor in the battery pack along with an ADC and EEPROM, that would log the battery temperature every 10 minutes, and store to EEPROM. The idea being that when the battery pack was re-connected to the equipment, then the EEPROM data would allow an estimation to be made of the self discharge since the battery was disconnected, by looking at what temperatures the battery pack had been sitting in (we expected it to be used in extremes, so the self-discharge varied a lot).
Fast-forward, and this new chip is amazing.
I was recharging the battery again just now, and rather than guessing how long I needed to wait for it to complete charging, I quickly plugged the USB-I2C adapter into the laptop, and the software told me precisely how long I needed to wait, and how much current the charger was supplying, in case I wished to terminate early (since the current is just a tiny amount in the last 15-20 minutes of charging).
The amount of data and the insight the chip provides is huge. It's providing visibility into the battery and connected equipment, which is really awesome.