A critical concern among battery users is knowing the "readiness" of a battery, or how much energy it has at its disposal at any given moment. While installing a fuel gauge on a diesel engine is simple, estimating the energy reserve of a battery is more complex; we still struggle to read state-of-charge (SoC) with reasonable accuracy. Even if SoC were precise, this information alone has limited benefit without knowing the capacity, the storage capability of a battery. Battery readiness, or state-of-function (SoF), must also include internal resistance, the "size of the pipe" for energy delivery. Figure 1 illustrates the bond between capacity and internal resistance by means of a fluid-filled container that erodes with age; the tap symbolizes energy delivery.
Figure 1: Relationship of CCA and capacity of a starter battery. The liquid represents capacity, the leading health indicator; the tap symbolizes energy delivery or CCA. While the energy delivery remains strong, the capacity diminishes with age. Courtesy of Cadex
Most batteries for critical missions feature a monitoring system, and stationary batteries were among the first to receive supervision in the form of voltage checks on individual cells. Some systems also include cell temperature and current measurement. Knowing the voltage drop of each cell at a given load provides the cell resistance. Elevated resistance hints at cell failure caused by plate separation, corrosion and other malfunctions. Battery management systems (BMS) are also used in medical equipment, military devices and electric vehicles.
Although the BMS serves an important role in supervising batteries, such systems often fall short of expectations, and here is why. The BMS device is matched to a new battery and does not adjust well to aging. As the battery gets older, the accuracy goes down, and in extreme cases the data becomes meaningless. Most BMS also lack bandwidth in that they only reveal anomalies once battery performance has dropped to 70 percent. The all-important 70–100 percent operating range is difficult to gauge, and the BMS gives the battery a clean bill of health. This prevents end-of-life prediction: the operator must wait for the battery to show signs of wear before making a judgment. These shortcomings are not an oversight by the manufacturers; engineers are trying to overcome them, but the problem boils down to technology, or the lack thereof. Over-expectation is common, and the user is stunned when stranded with a dead battery. Let's look at how current systems work and examine new technologies.
The simplest method to determine end-of-battery-life is applying a date stamp or observing the cycle count. While this may work for military and medical instruments, such a routine is ill-suited for commercial applications. A battery with less use suffers less wear-and-tear than one in daily operation, yet to assure the reliability of all batteries, the authorities may mandate that every battery be replaced on the same early schedule. Such a one-size-fits-all approach causes good batteries to be discarded too soon, leading to increased operating costs and environmental concerns, as the sketch below illustrates.
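Expressed as a minimal sketch, such a rule reduces to two fixed thresholds. The three-year service life and the 500-cycle limit below are illustrative assumptions, not figures from the article:

```python
from datetime import date, timedelta

# Hypothetical thresholds for illustration only.
SERVICE_LIFE = timedelta(days=3 * 365)
CYCLE_LIMIT = 500

def due_for_replacement(install_date: date, cycle_count: int) -> bool:
    """Flag a battery for replacement on age or cycle count alone."""
    return (date.today() - install_date > SERVICE_LIFE
            or cycle_count >= CYCLE_LIMIT)

# A lightly used battery is retired on the same date as a heavily
# cycled one: the one-size-fits-all waste described above.
```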
Laptops and other portable devices use coulomb counting for the SoC readout. The theory goes back 250 years to when Charles-Augustin de Coulomb first established the "Coulomb Rule." Coulomb counting works on the principle of measuring the in- and out-flowing current of a battery. If, for example, a battery is charged for one hour at one ampere, the same energy should be available on discharge, but this is not the case. Internal losses and inaccuracies in capturing current flow add up to an unwanted tracking error that must be corrected with periodic calibration.
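In code form, a coulomb counter is simply an integrator of current over time. The sketch below is a rough illustration rather than any vendor's implementation; it also shows where drift creeps in, through the assumed 99 percent coulombic efficiency and the fixed sampling interval that can miss fast current pulses:

```python
# A minimal coulomb-counting sketch, assuming periodic current samples
# in amperes; class and parameter names are illustrative.
class CoulombCounter:
    def __init__(self, rated_capacity_ah: float):
        self.capacity_ah = rated_capacity_ah
        self.charge_ah = rated_capacity_ah        # assume we start full

    def update(self, current_a: float, dt_s: float):
        """Integrate current over one sample (positive = charging)."""
        if current_a > 0:
            current_a *= 0.99  # assumed coulombic efficiency: losses
        self.charge_ah += current_a * dt_s / 3600.0
        self.charge_ah = min(max(self.charge_ah, 0.0), self.capacity_ah)

    @property
    def soc_percent(self) -> float:
        return 100.0 * self.charge_ah / self.capacity_ah
```

Charging this model at one ampere for one hour books slightly less than one ampere-hour, mirroring the internal losses noted above.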
Calibration occurs naturally when running the equipment down. A full discharge sets the discharge flag, and the subsequent recharge establishes the charge flag (Figure 2). These two markers allow the calculation of state-of-charge by estimating the distance between the flags.
Figure 2: Discharge and charge flags. Calibration occurs by applying a full charge, discharge and charge. This can be done in the equipment or externally with a battery analyzer as part of battery maintenance. Courtesy of Cadex
Coulomb counting should be self-calibrating, but in real life a battery does not always get a full discharge at a steady current. The discharge may come in the form of a sharp pulse that is difficult to capture. The battery may then be partially recharged and stored at high temperature, causing elevated self-discharge that cannot be tracked. To correct the tracking error, a "smart battery" in use should be calibrated once every three months or after 40 partial discharge cycles. This can be done with a deliberate discharge in the equipment or externally with a battery analyzer. Avoid too many intentional deep discharges, as this stresses the battery.
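The calibration schedule lends itself to simple bookkeeping. A sketch under the stated rule, three months or 40 partial cycles, whichever comes first, with illustrative names:

```python
from datetime import datetime, timedelta

class CalibrationTracker:
    def __init__(self):
        self.partial_cycles = 0
        self.last_calibrated = datetime.now()

    def note_partial_cycle(self):
        self.partial_cycles += 1

    def note_full_cycle(self):
        # Discharge flag and charge flag both set: error resets.
        self.partial_cycles = 0
        self.last_calibrated = datetime.now()

    def calibration_due(self) -> bool:
        stale = datetime.now() - self.last_calibrated > timedelta(days=90)
        return stale or self.partial_cycles >= 40
```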
Fifty years ago, the Volkswagen Beetle had few battery problems; the only battery management was ensuring that the battery was being charged while driving. Onboard electronics for safety, convenience, comfort and pleasure have since added to the demands on the battery in modern cars. For the accessories to function reliably, the battery state-of-charge must be known at all times. This is especially critical with start-stop technologies, a coming requirement in European cars to improve fuel economy.
When the engine of a start-stop vehicle turns off at a stoplight, the battery continues to draw 25–50 amperes to feed the lights, ventilators, windshield wipers and other accessories. The battery must have enough charge to crank the engine when the traffic light changes; cranking requires a brief 350A. To reduce engine loading during acceleration, the BMS delays charging for about 10 seconds.
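A quick worked example shows how these currents add up. The 40 A draw sits within the quoted 25–50 A range; the stop duration and battery rating are assumed values for illustration:

```python
accessory_draw_a = 40.0       # accessory load with engine off
stop_duration_s = 90.0        # assumed wait at a red light
battery_capacity_ah = 70.0    # assumed starter battery rating

drain_ah = accessory_draw_a * stop_duration_s / 3600.0
print(f"One stop consumes {drain_ah:.2f} Ah, "
      f"{100 * drain_ah / battery_capacity_ah:.1f}% of capacity")
# One stop costs about 1 Ah (~1.4% of capacity); a gridlocked commute
# with dozens of stops explains why SoC can sag faster than idling
# replenishes it.
```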
Modern cars are equipped with a battery sensor that measures voltage, current and temperature. Packaged in a small housing and embedded in the positive battery clamp, the electronic battery monitor (EBM) provides an SoC accuracy of about +/–15 percent on a new battery. As the battery ages, the EBM begins to drift and the accuracy drops to 20–30 percent. This can result in false warning messages, and some garage mechanics disconnect the EBM on an aging battery to stop the annoyance. Disabling the control system responsible for the start-stop function, however, immobilizes engine stop and compromises the vehicle's compliance with clean-air requirements.
Voltage, current and temperature readings are insufficient to assess battery SoF; the all-important capacity is missing. Until capacity can be measured with confidence on board a vehicle, the EBM will not offer reliable battery information. Capacity is the leading health indicator that in most cases determines end-of-battery-life. Imagine measuring liquid in a container that is continuously shrinking in size: state-of-charge alone has limited benefit if the storage has shrunk from 100 to 20 percent and this change cannot be measured. Capacity fade may not affect engine cranking; the CCA can remain at a vigorous 70 percent to the end of battery life. Because of reduced energy storage, a low-capacity battery charges quickly and shows normal vital signs, but failure is imminent. A biannual capacity check as part of service can identify low-capacity batteries, and battery testers that read capacity are becoming available at garages.
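The container analogy translates directly into arithmetic: usable reserve is SoC applied to the remaining, faded capacity, not the rated one. A minimal sketch, with illustrative numbers echoing the 100-to-20-percent shrinkage above:

```python
def reserve_ah(soc_percent: float, rated_ah: float,
               remaining_capacity_percent: float) -> float:
    """Usable reserve = SoC applied to the remaining capacity."""
    actual_ah = rated_ah * remaining_capacity_percent / 100.0
    return actual_ah * soc_percent / 100.0

print(reserve_ah(100, 70, 100))  # new battery at full charge: 70.0 Ah
print(reserve_ah(100, 70, 20))   # faded to 20% capacity:      14.0 Ah
# Both batteries report "100% SoC"; only the capacity term reveals
# that the second one is nearly spent.
```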
A typical start-stop vehicle goes through about 2,000 micro cycles per year. Test data obtained from automakers and the Cadex laboratories indicate that battery capacity drops to approximately 60 percent in two years in a start-stop configuration. The standard flooded lead acid battery is not robust enough for start-stop, and carmakers use a modified AGM (absorbent glass mat) battery to attain longer life.
Automakers want to make sure that no driver gets stuck in traffic with a dead battery. To conserve energy when SoC is low, the BMS automatically turns unnecessary accessories off and the engine stays running at a stoplight. Even with this preventive measure, SoC can remain low when commuting in gridlock: idling does not provide much charge, and with essential accessories engaged, such as lights and windshield wipers, the net effect could be a small discharge.
Battery monitoring is also important in hybrid vehicles to optimize charge levels. The BMS prevents stressful overcharge above 80 percent SoC and avoids deep discharges below 30 percent. At a low charge level, the internal combustion engine engages earlier and is left running longer to provide additional charge.
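Reduced to logic, this policy is a pair of thresholds. The 30 and 80 percent limits come from the text; the function and return strings are illustrative:

```python
def charge_action(soc_percent: float) -> str:
    """Keep a hybrid battery inside its 30-80% SoC window."""
    if soc_percent < 30.0:
        return "engage engine early and keep it running to recharge"
    if soc_percent > 80.0:
        return "suspend charging to avoid stressful overcharge"
    return "normal operation within the 30-80% window"
```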
The driver of an electric vehicle (EV) expects the same accuracy in the energy-reserve readout as is possible with a gasoline-powered car. Current technologies do not allow this, and some EV drivers may get stuck with an empty battery while the fuel gauge still indicates reserve. Furthermore, the EV driver anticipates that a fully charged battery will travel the same distance year after year. This is not possible; the range will decrease as the battery fades with age. Distances between charges will also be shorter than normal when driving in cold temperatures because of reduced battery performance.
Some lithium-ion batteries have a very flat discharge curve, and the voltage method does not work well to provide SoC in the mid-range. An innovative new technology is being developed that measures battery SoC by magnetic susceptibility. Quantum magnetism (Q-Mag™) detects magnetic changes in the electrolyte and plates that correspond to state-of-charge. This provides accurate SoC detection in the critical 40–70 percent mid-section. More importantly, Q-Mag™ allows measuring SoC while the battery is being charged or is under load.
The lithium iron phosphate battery in Figure 3 shows a clear decrease in relative magnetic field units while discharging and an increase while charging, which relates to SoC. There is none of the rubber-band effect typical of the voltage method, in which the weight of discharge pulls the terminal voltage down and charge lifts it up. Q-Mag™ also permits improved full-charge detection; however, the system only works with cells in plastic, foil or aluminum enclosures, as ferrous metals inhibit the magnetic field.
Figure 3: Magnetic field measurements of a lithium iron phosphate battery during charge and discharge. Relative magnetic field units provide accurate state-of-charge of lithium- and lead-based batteries. Courtesy of Cadex (2011)
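Because the magnetic response rises monotonically with charge and shows no rubber-band effect, a reading can in principle be mapped back to SoC by interpolating on a calibration curve for the cell type. The sketch below is purely hypothetical: the curve points are invented, and the actual Q-Mag™ signal processing is proprietary.

```python
import numpy as np

# Invented calibration points: (relative field units, SoC percent).
field_units = np.array([0.82, 0.88, 0.94, 1.00])
soc_points  = np.array([10.0, 40.0, 70.0, 100.0])

def soc_from_field(reading: float) -> float:
    # A monotonic curve lets simple interpolation recover SoC,
    # even while the battery is under charge or load.
    return float(np.interp(reading, field_units, soc_points))

print(soc_from_field(0.91))  # mid-range reading -> 55.0 (% SoC)
```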
Q-Mag™ also works with lead acid, which opens the door to monitoring starter batteries in vehicles. Figure 4 illustrates the Q-Mag™ sensor installed in close proximity to the negative plate. Knowing the precise state-of-charge at any given moment optimizes charge methods and identifies battery deficiencies, including end-of-battery-life through on-board capacity estimation.
Figure 4: Q-Mag™ sensor installed on the side of a starter battery. The sensor measures the SoC of a battery by magnetic susceptibility. When discharging a lead acid battery, the negative plate changes from lead to lead sulfate. Lead sulfate has a different magnetic susceptibility than lead, which a magnetic sensor can measure. Courtesy of Cadex (2009)
Q-Mag™ is also a candidate for monitoring stationary batteries. The sensing mechanism does not need to touch the electrical poles for voltage measurements, which poses an advantage for high-voltage batteries. Furthermore, Q-Mag™ can assist EVs by providing SoF accuracies not possible with a conventional BMS. Q-Mag™ may one day also serve the consumer market to test batteries by magnetism: it is conceivable that an iPhone or iPad could be placed on a test mat, similar to a charging mat, to read battery SoC and performance.