Electric vehicles have recently boasted impressive growth rates, more than doubling in market penetration every two years between 2014 and 2018. And batteries play a key role in EV performance and price. That’s why some companies are looking to new chemistries and battery technologies to sustain EV growth rates throughout the early 2020s.
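Those growth figures imply a brisk compound rate. A back-of-the-envelope sketch in Python (the 1 percent starting share is a hypothetical for illustration, not a figure from the article):

```python
# If EV market penetration more than doubles every two years, the
# implied annual growth rate is the square root of 2, minus 1.
annual_growth = 2 ** (1 / 2) - 1
print(f"Implied annual growth: {annual_growth:.0%}")  # ~41%

# Compounding forward: a hypothetical 1 percent market share in 2014
# would undergo two doublings by 2018.
share_2014 = 0.01
share_2018 = share_2014 * 2 ** ((2018 - 2014) / 2)
print(f"Hypothetical share by 2018: {share_2018:.0%}")  # 4%
```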
Three recent developments suggest that executives are more than just hopeful. They are, in fact, already striking deals to acquire and commercialize new EV battery advances. And progress has been broad—the new developments concern the three main electrochemical components of a battery: its cathode, electrolyte, and anode.
TESLA’S BIG BETS Analysts think Tesla’s upcoming annual Battery Day (the company hadn’t yet set a date at press time) will hold special significance. Maria Chavez of Navigant Research in Boulder, Colo., expects to hear about at least three big advancements.
The first one (which Reuters reported in February) is that Tesla will develop batteries with cathodes made from lithium iron phosphate for its Model 3s. These LFP batteries—with “F” standing for “Fe,” the chemical symbol for iron—are reportedly free of cobalt, which is expensive and often mined using unethical practices. LFP batteries also have higher charge and discharge rates and longer lifetimes than conventional lithium-ion cells. “The downside is that they’re not very energy dense,” says Chavez.
To combat that, Tesla will reportedly switch from standard cylindrical cells to prism-shaped cells—the second bit of news Chavez expects to hear about. Stacking prismatic cells rather than cylinders would allow Tesla to pack more battery capacity into a given space.
A third development, Chavez says, may concern Tesla’s recent acquisition, Maxwell Technologies. Before being bought by Tesla in May 2019, Maxwell specialized in making supercapacitors: essentially charged metal plates with proprietary materials in between, they boost a device’s charge capacity and performance.
Supercapacitors are famous for pumping electrons into and out of a circuit at blindingly fast speeds. So an EV power train with a supercapacitor could quickly access stores of energy for instant acceleration and other power-hungry functions. On the flip side, the supercapacitor could also rapidly store incoming charge to be metered out to the lithium battery over longer stretches of time—which could both speed up quick charging and possibly extend battery life.
So could blending supercapacitors, prismatic cells, and lithium iron phosphate chemistry provide an outsize boost for Tesla’s EV performance specs? “The combination of all three things basically creates a battery that’s energy dense, low cost, faster to charge, and cobalt-free—which is the promise that Tesla has been making for a while now,” Chavez said.
SOLID-STATE DEALS Meanwhile, other companies are focused on replacing the flammable liquid electrolyte of conventional lithium batteries, to improve both safety and performance. In February, Mercedes-Benz announced a partnership with the Canadian utility Hydro-Québec to develop next-generation lithium batteries with a solid, nonflammable electrolyte. And a month prior, the Canadian utility had announced a separate partnership with the University of Texas at Austin and lithium-ion battery pioneer John Goodenough to commercialize a solid-state battery with a glass electrolyte.
“Hydro-Québec is the pioneer of solid-state batteries,” said Karim Zaghib, general director of the utility’s Center of Excellence in Transportation Electrification and Energy Storage. “We started doing research and development in [lithium] solid-state batteries...in 1995.”
Although Zaghib cannot disclose the specific electrolytes his lab will be working with Mercedes to develop, he says the utility is building on a track record of successful battery technology rollouts with companies including A123 Systems in the United States, Murata Manufacturing in Japan, and Blue Solutions in Canada.
STARTUP SURPRISE Lastly, Echion Technologies, a startup based in Cambridge, England, said in February that it had developed a new anode for high-capacity lithium batteries that could charge in just 6 minutes. (Not to be outdone, a team of researchers in Korea announced that same month that its own silicon anode would charge to 80 percent in 5 minutes.)
Echion CEO Jean de la Verpilliere—a former engineering Ph.D. student at the nearby University of Cambridge—says Echion’s proprietary “mixed niobium oxide” anode is compatible with conventional cathode and electrolyte technologies.
“That’s key to our business model, to be ‘drop-in,’ ” says de la Verpilliere, who employs several former Cambridge students and staff. “We want to bring innovation to anodes. But then we will be compatible with everything else in the battery.”
In the end, the winning combination for next-generation batteries may well include one or more breakthroughs from each category—cathode, anode, and electrolyte.
This article appears in the April 2020 print issue as “EV Batteries Shift Into High Gear.”
Neolix, a maker of urban robo-delivery trucks, made an interesting claim recently. The Beijing-based company said orders for its self-driving delivery vehicles were soaring because the coronavirus epidemic had both cleared the roads of cars and opened the eyes of customers to the advantages of driverlessness. The idea is that when the epidemic is over, the new habits may well persist.
Neolix last week told Automotive News it had booked 200 orders in the past two months after having sold just 159 in the eight months before. And on 11 March, the company confirmed that it had raised US $29 million in February to fund mass production.
Of course, this flurry of activity could merely be coincidental to the epidemic, but Tallis Liu, the company’s manager of business development, maintains that it reflects changing attitudes in a time of plague.
“We’ve seen a rise in both acceptance and demand both from the general public and from the governmental institutions,” he tells IEEE Spectrum. The sight of delivery bots on the streets of Beijing is “educating the market” about “mobility as a service” and on “how it will impact people’s day-to-day lives during and after the outbreak.”
During the epidemic, Neolix has deployed 50 vehicles in 10 major Chinese cities to handle mobile delivery as well as disinfection services. Liu says that many of the routes were chosen because they include public roads that the lockdown on movement has left relatively empty.
The company’s factory has a production capacity of 10,000 units a year, and most of the factory staff has returned to their positions, Liu adds. “Having said that, we are indeed facing some delays from our suppliers given the ongoing situation.”
Neolix’s deliverybots are adorable—a term this site once used to describe a strangely similar-looking rival bot from the U.S. firm Nuro. The bots are the size of a small car, and they’re each equipped with cameras, three 16-channel lidar laser sensors, and one single-channel lidar. The low-speed version also has 14 ultrasonic short-range sensors; on the high-speed version, the ultrasonic sensors are supplanted by radars.
If self-driving technology benefits from the continued restrictions on movement in China and around the world, it wouldn’t be the first time that necessity had been the mother of invention. An intriguing example is furnished by a mere two-day workers’ strike on the London Underground in 2014. Many commuters, forced to find alternatives, ended up sticking with those workarounds even after Underground service resumed, according to a 2015 analysis by three British economists.
One of the researchers, Tim Willems of Oxford University, tells Spectrum that disruptions can induce permanent changes when three conditions are met. First, “decision makers are lulled into habits and have not been able to achieve their optimum (close to our Tube strike example).” Second, “there are coordination failures that make it irrational for any one decision maker to deviate from the status quo individually” and a disruption “forces everybody away from the status quo at the same time.” And third, the reluctance to pay the fixed costs required to set up a new way of doing things can be overcome under crisis conditions.
By that logic, many workers sent home for months on end to telecommute will stay on their porches or in their pajamas long after the all-clear signal has sounded. And they will vastly accelerate the move to online shopping, with package delivery of both the human and the nonhuman kind.
On Monday, New York City’s mayor, Bill de Blasio, said he was suspending his long-running campaign against e-bikes. “We are suspending that enforcement for the duration of this crisis,” he said. And perhaps forever.
As a transportation technology journalist, I’ve ridden in a lot of self-driving cars, both with and without safety drivers. A key part of the experience has always been a laptop or screen showing a visualization of other road users and pedestrians, using data from one or more laser-ranging lidar sensors.
Ghostly three-dimensional shapes made of shimmering point clouds appear at the edge of the screen, and are often immediately recognizable as cars, trucks, and people.
At first glance, the screen in Echodyne’s Ford Flex SUV looks like a lidar visualization gone wrong. As we explore the suburban streets of Kirkland, Washington, blurry points and smeary lines move across the display, changing color as they go. They bear little resemblance to the vehicles and cyclists I can see out of the window.
That’s because this car is not using lidar to build up a picture of its surroundings but a new cognitive radar system called EchoDrive, developed by Echodyne, a Bill Gates-funded startup. Ironically, I can’t immediately interpret the visualization precisely because Echodyne’s radar functions more like human vision—jumping around to focus on what’s important—than like a lidar’s all-at-once global view.
Time for some Sensors 101. Lidars work by using mirrors to direct laser pulses over a wide field of view, allowing a vehicle to detect hazards through a full 360 degrees. As laser light has very short wavelengths, the spatial resolution of lidar is also excellent. However, lidar performance degrades in rain, fog, or snow, and the best units are still very expensive.
Radars are cheap, unaffected by the weather, and can work over long distances. But they suffer from two big problems. With longer wavelengths, radars can struggle to resolve small features, especially at long range. And traditional radars are not easy to direct over wide scenes without bulky mechanical antennas, like the spinning radars on ships.
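The resolution gap between the two sensors follows from diffraction: angular resolution scales roughly as wavelength divided by aperture (θ ≈ λ/D). A rough sketch, using an assumed 5-centimeter aperture, a typical 905-nanometer lidar wavelength, and the common 77-gigahertz automotive radar band (none of these figures come from the article):

```python
import math

C = 3.0e8        # speed of light, m/s
aperture = 0.05  # assumed 5-cm sensor aperture, m

lidar_wavelength = 905e-9    # typical near-infrared lidar, m
radar_wavelength = C / 77e9  # 77-GHz radar carrier -> ~3.9 mm

for name, wl in [("lidar", lidar_wavelength), ("radar", radar_wavelength)]:
    theta = wl / aperture     # diffraction-limited beamwidth, radians
    spot_100m = theta * 100   # beam footprint at 100 m, meters
    print(f"{name}: ~{math.degrees(theta):.4f} deg beamwidth, "
          f"~{spot_100m:.3f} m spot at 100 m")
```

The radar beam spreads to a footprint of several meters at 100 meters, while the lidar spot stays at millimeter scale, which is why radar struggles to resolve small features at long range.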
Thus, existing automotive radars generally have narrow, fixed fields of view and little ability to discriminate what they are detecting. (Radar has been implicated in several crashes involving Tesla’s Autopilot system.)
Echodyne’s innovation is to use scanning arrays, based on metamaterials whose refractive index can be tuned electronically. The radar beam can then be steered across a wide field of view to scan for large obstacles in the road ahead, or focused on a small area to help identify what it has detected. Just being able to steer and task the radar can give an order of magnitude more sensitivity at long range than existing systems, according to Tom Driscoll, Echodyne’s founder and CTO.
“The basic concept is that by pushing and pulling a four-dimensional data cube of azimuth, elevation, range, and Doppler around, you can allocate the overall bound resources of the radar where and when you need them,” he says.
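Driscoll’s idea of allocating the radar’s resources “where and when you need them” can be illustrated with a toy scheduler that splits a fixed dwell-time budget across regions of the scene by priority. The region names, weights, and 50-millisecond frame budget below are hypothetical illustrations, not Echodyne’s actual design:

```python
# Toy sketch of the cognitive-radar concept: a fixed budget of beam
# dwell time per frame is reallocated across scene regions according
# to task priority. All numbers here are illustrative assumptions.
FRAME_BUDGET_MS = 50.0  # assumed total dwell time per frame

def allocate(priorities):
    """Split the frame budget proportionally to each region's priority."""
    total = sum(priorities.values())
    return {region: FRAME_BUDGET_MS * p / total
            for region, p in priorities.items()}

# Open highway: mostly track the cars ahead.
print(allocate({"ahead": 8, "left": 1, "right": 1}))

# Approaching a T-intersection: shift attention to the left,
# as a human driver would.
print(allocate({"ahead": 2, "left": 6, "right": 2}))
```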
DARPA calls this concept cognitive radar, and has been working on developing it for at least a decade, mostly using high performance (and extremely expensive) phased array radars.
Driscoll makes an analogy with human vision. Although we feel as though we have sharp vision across a wide field in front of us, we actually have good resolution only at the very center of the eye; the brain rapidly redirects that narrow high-resolution focus around the scene as needed.
Similarly, the EchoDrive radar can be tasked by the car’s computer to interrogate different parts of the scene. “On an open road, we might have beams tracking identified cars in front of us,” says Driscoll. “Then you could imagine coming to a T-intersection and spending more time looking left, just like a human driver does. When you reach a crosswalk, you’ll use interrogation modes specifically designed to check whether there are pedestrians.”
EchoDrive has another trick up its sleeve. As well as reporting the azimuth (horizontal angle), elevation, and range of an object, as a lidar does, a radar can also detect its relative velocity, because the returning signal is Doppler-shifted. Radar pulses might not have the spatial resolution of lidar, but micro-Doppler radar spectra can identify objects by revealing distinctive features like a runner’s moving arms or the spinning of a bicycle wheel.
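That velocity measurement follows from the standard radar Doppler relation, v = c * Δf / (2 * f0), where the factor of 2 accounts for the signal’s round trip. A minimal sketch, assuming a 77-gigahertz carrier (a common automotive radar band, not a figure from the article):

```python
C = 3.0e8   # speed of light, m/s
F0 = 77e9   # assumed radar carrier frequency, Hz

def radial_velocity(doppler_shift_hz):
    """Relative (radial) velocity implied by a measured Doppler shift.
    The factor of 2 accounts for the round trip of the reflected signal."""
    return C * doppler_shift_hz / (2 * F0)

# A 10-kHz shift corresponds to roughly 19.5 m/s (~70 km/h) closing speed.
print(f"{radial_velocity(10e3):.1f} m/s")
```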
As our drive (which was under human control) progressed, I could begin to make out features from the visualization. Because it showed only narrow vertical slices of the scene ahead, a bridge might appear as dots (support pillars), a band (the bridge roadway), or nothing at all (for a slice looking too far up). Pedestrians and even construction cones were visible, and everything was color-coded—red for receding, blue for approaching.
A typical self-driving car might have between three and six such cognitive radar units, working alongside traditional long-range radars, cameras, and lidars, and drawing on a database of high-definition maps. Although cognitive radar might be superior to lidar in some situations, Driscoll sees it as an addition rather than a replacement for the laser units.
“My view is that these vehicles at Level 4 and 5 [capable of driving without human oversight] should have every high-end sensor you can possibly rationalize, at least until the problem’s solved,” he says. “Sensor fusion is about having enough overlap between sensors that you can stitch them together.”
Echodyne has already supplied prototype EchoDrive radars to several companies working on self-driving technologies, and Driscoll believes that a production version should sell for less than US $1,000. That’s more expensive than today’s clumsy collision radars but much cheaper than state-of-the-art lidars.
Laser companies are also experimenting with cheaper lidars enabled by metamaterials, likely destined for Level 2 and 3 driver assistance systems. Even if practical and reliable self-driving cars remain science fiction for years to come, smarter, cheaper sensors like these could make driving safer for everyone in the meantime.
A lot of people in the auto industry talked for way too long about the imminent advent of fully self-driving cars.
In 2013, Carlos Ghosn, now very much the ex-chairman of Nissan, said it would happen in seven years. In 2016, Elon Musk, then chairman of Tesla, implied his cars could basically do it already. From 2017 right through early 2019, GM Cruise was promising a launch in 2019. And Waymo, the company with the most to show for its efforts so far, now speaks in more measured terms than it did just a year or two ago.
It’s all making Gill Pratt, CEO of the Toyota Research Institute in California, look rather prescient. A veteran roboticist who joined Toyota in 2015 with the task of developing robocars, Pratt from the beginning emphasized just how hard the task would be and how important it was to aim for intermediate goals—notably by making a car that could help drivers now, not merely replace them at some distant date.
That helpmate, called Guardian, is set to use a range of active safety features to coach drivers and, in the worst cases, to save them from their own mistakes. The more ambitious Chauffeur will one day really drive itself, though in a constrained operating environment. The constraints on the current iteration will be revealed at its first demonstration, at this year’s Olympic Games in Tokyo; they will certainly include limits on how far afield and how fast the car may go.
Earlier this week, at TRI’s office in Palo Alto, Calif., Pratt and his colleagues gave Spectrum a walkaround look at the latest version of the Chauffeur, the P4; it’s a Lexus with a package of sensors neatly merging with the roof. Inside are two lidars from Luminar, a stereocamera, a mono-camera (just to zero in on traffic signs), and radar. At the car’s front and corners are small Velodyne lidars, hidden behind a grill or folded smoothly into small protuberances. Nothing more could be glimpsed, not even the electronics that no doubt filled the trunk.
Pratt and his colleagues had a lot to say on the promises and pitfalls of self-driving technology. The easiest to excerpt is their view on the difficulty of the problem.
“There isn’t anything that’s telling us it can’t be done; I should be very clear on that,” Pratt says. “Just because we don’t know how to do it doesn’t mean it can’t be done.”
That said, though, he notes that early successes (using deep neural networks to process vast amounts of data) led researchers to optimism. In describing that optimism, he does not object to the phrase “irrational exuberance,” made famous during the 1990s dot-com bubble.
It turned out that the early successes came in those fields where deep learning, as it’s known, was most effective, like artificial vision and other aspects of perception. Computers, long held to be particularly bad at pattern recognition, were suddenly shown to be particularly good at it—even better, in some cases, than human beings.
“The irrational exuberance came from looking at the slope of the [graph] and seeing the seemingly miraculous improvement deep learning had given us,” Pratt says. “Everyone was surprised, including the people who developed it, that suddenly, if you threw enough data and enough computing at it, the performance would get so good. It was then easy to say that because we were surprised just now, it must mean we’re going to continue to be surprised in the next couple of years.”
The mindset was one of permanent revolution: The difficult, we do immediately; the impossible just takes a little longer.
Then came the slow realization that AI not only had to perceive the world—a nontrivial problem, even now—but also to make predictions, typically about human behavior. That problem is more than nontrivial. It is nearly intractable.
Of course, you can always use deep learning to do whatever it does best, and then use expert systems to handle the rest. Such systems use logical rules, input by actual experts, to handle whatever problems come up. That method also enables engineers to tweak the system—an option that the black box of deep learning doesn’t allow.
Putting deep learning and expert systems together does help, says Pratt. “But not nearly enough.”
Day-to-day improvements will continue no matter what new tools become available to AI researchers, says Wolfram Burgard, Toyota’s vice president for automated driving technology.
“We are now in the age of deep learning,” he says. “We don’t know what will come after—it could be a rebirth of an old technology that suddenly outperforms what we saw before. We are still in a phase where we are making progress with existing techniques, but the gradient isn’t as steep as it was a few years ago. It is getting more difficult.”
With global consumers tethered to their smartphones, automakers realize their cars need to deliver a similar infotainment experience—even if that means sharing the ride with Google and other tech giants. The long-awaited Android Automotive OS system debuts in a few months in the 2020 Polestar 2, and will ultimately power millions of cars from General Motors, Fiat Chrysler Automobiles, and the Renault-Nissan-Mitsubishi alliance.
If you’re not familiar, Polestar is the electric performance division of Sweden’s Volvo Cars and its China-based parent, Geely Auto Group. And the Polestar Precept, an electric concept car unveiled online on Tuesday after the coronavirus forced the cancellation of the Geneva International Motor Show, suggests a bright future for both Polestar design and Android OS.
But it’s the Polestar 2 that will debut the open-source system in showrooms, with its Android-powered navigation, apps, voice commands, and screen prods. The alluring fastback sedan, an electric rival to Tesla’s Model 3, combines 408 all-wheel-drive horsepower with a roughly 450-kilometer range for a US $63,000 starting price.
While Polestar will score the showroom first, General Motors made waves in September when it announced that Android OS will underpin the infotainment units of Chevrolet, Buick, Cadillac, and GMC models beginning in the 2021 calendar year. Fiat Chrysler Automobiles and the Renault-Nissan-Mitsubishi alliance are also on board, with each automaker customizing the look and feel of its systems to suit its own design needs. With GM alone holding about 17 percent of the U.S. car market, that’s a huge win for Google, according to Brian Rhodes, research and analysis manager for connected cars at IHS Markit.
“Volvo is the first domino to fall,” Rhodes said. “And for automakers, Google and consumers, it’s a big milestone in where infotainment is going.”
The Polestar 2 shows what this world, and its human-machine interface, will look like. Virtually every user control is housed on a 28-centimeter, tablet-style touchscreen in the sleekly minimal interior. (Fortunately, there’s still an analog audio knob, something that many drivers insist upon.)
Owners will be able to access embedded features including Google Maps, Google Assistant, and Google Play Store, even when their phone is switched off. A phone-based digital key tailors the environment to individual users as they approach and unlock the car, adjusting settings for seats, mirrors, climate, and entertainment. Video streaming from popular apps and services will also be available, but only when the car is parked or charging.
Google Assistant, with its rapidly expanding repertoire of languages, local accents, and conversational speech patterns, should improve upon the cumbersome preset voice commands that many consumers despised—or never bothered to learn—in automakers’ in-house systems. Drivers can also use it to control smart devices in the home. And where old-school systems can no longer keep pace with rapid technological change—CD player, anyone?—Android OS units can be updated over the air, keeping infotainment and apps current as the car ages.
The Polestar Precept, for its part, offers a glimpse of future possibilities. Space under the hood, once given over to an internal-combustion engine and radiators, houses dual radar sensors and a high-def camera behind a transparent panel, plus LED headlamps shaped like “Thor’s Hammer,” a signature carried over from Volvo. A lidar pod atop the glass roof and a wide-angle rear camera complete the sensor suite for safety and driver-assistance functions. Camera-based units replace traditional side mirrors.
The Precept also previews the brand’s next-generation, Android-powered human-machine interface, including a portrait-oriented, 38-centimeter center touchscreen and 32-centimeter driver’s display. To address potential information overload and avoid driver distraction, the Precept’s interface features eye-tracking sensors that can illuminate screens or adjust content when drivers glance at them, then dim when they look away. Proximity sensors, already familiar in Cadillacs and some other luxury cars, call up relevant screen information as a hand approaches.
Automakers, which throughout history have farmed out some component design and manufacture to suppliers—including Bosch electronics and ZF transmissions—have realized that their embedded infotainment systems can no longer keep pace with Silicon Valley’s sophisticated best. Instead, buyers are demanding ecosystems that mimic apps they already use on their smartphones.
That imperative has allowed Apple and Google to pry their way into the automotive space. Apple CarPlay and Android Auto let users beam smartphone-based navigation and other apps to a car’s central touchscreen. The two systems are now available on more than 400 car models around the world, but they still require a plugged-in, charged-up phone to operate.
Not to be left behind, Amazon is partnering with Toyota, Audi, Ford, and other automakers to bring Alexa to the dashboard. Microsoft is co-developing infotainment with Ford, Hyundai, and Kia. Rhodes says that tech giants’ foray into cars certainly raises issues over privacy, and who will control or monetize data.
“From Google’s perspective, it’s about collecting data, period,” Rhodes said. “It’s the strategy they entered automotive with, and the strategy they continue to pursue.”
Automakers, Rhodes said, recognize the need to wall off some of their “mission critical” data, or protect customers from unwanted intrusions. GM has said it will share certain data with Google, but hasn’t gotten into specifics.
Polestar spokesman JP Canton said that especially sensitive “black box” data, such as vehicle speed, battery level, driving range, and charging rate, will be used by Google or app developers only with a user’s permission, which an owner can revoke at any time. But Canton notes, for example, that sharing a car’s battery level with Google Maps will help a driver get accurate route calculations and recommendations for charging stops along the way.
As ever, Rhodes said, there will be a trade-off between tech goodies that consumers demand, and privacy they’re willing to give up.
“We’ve already seen Google take a more-flexible approach in negotiating with automakers,” Rhodes says. “They know they want access to the car, customers, and data, so they’re willing to adapt to get it.”
This story was updated on 4 March 2020.