Here’s why you should take EPA range estimates for EVs with a pinch of salt
Like a certain well-known rugby-shirt-wearing, floppy-haired car critic, when I hear about new cars, I get “the fizz.” I dive straight into the stats and the graphical renders, and head to forums to find out what others are saying about the new whip.
When it comes to EVs, the first stat I look for is the Environmental Protection Agency’s (EPA) estimated range figure. In a world of electric vehicle Top Trumps, this is the number that beats all else. Forget 0-60 times, it’s all about how far an EV can go on a single charge.
Except, it’s painfully clear that EPA figures can’t really be trusted. Just like emissions tests for combustion engine cars, what we get in the real world is proving to be quite different from what dyno-based tests suggest.
The latest tests from Edmunds, a US-based car buying advice website, are a perfect example. You can read its full report here, but here’s the one thing you need to know: in real-world tests, electric cars virtually never match their EPA estimated range.
Before you go thinking that Edmunds is some harbinger of range anxiety — a necromancer of negativity — it’s important to know that while some EVs fell way short of their EPA range estimates, others far surpassed them.
Here are a few examples:
1. The Porsche Taycan 4S: 203 miles (EPA range) vs 323 miles (Edmunds’ tests). That’s over 59% further.
2. A 2019 Hyundai Kona Electric: 258 miles (EPA) vs 315 miles (Edmunds’ tests).
3. Even the MINI Cooper SE: 110 miles (EPA) vs 150 miles (Edmunds’ tests).
Not every car fared this well. In fact, one brand in particular continually fell short of its estimated ranges: Tesla.
For example, a Long Range Model X missed its expected range by 10%, a Model 3 Performance fell 17% short of its EPA range, and a Model Y Performance stopped nearly 10% short of its suggested EPA range.
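The over- and under-shoots above all come down to the same simple arithmetic: the measured range compared against the EPA estimate, expressed as a percentage. A minimal sketch, using the figures quoted in this article:

```python
def range_deviation(epa_miles, measured_miles):
    """Percent difference between a measured real-world range and the EPA estimate."""
    return (measured_miles - epa_miles) / epa_miles * 100

# Figures from this article (positive = beat the EPA estimate).
print(f"Porsche Taycan 4S:  {range_deviation(203, 323):+.1f}%")  # +59.1%
print(f"Hyundai Kona EV:    {range_deviation(258, 315):+.1f}%")  # +22.1%
print(f"MINI Cooper SE:     {range_deviation(110, 150):+.1f}%")  # +36.4%
```

Which is how the Taycan 4S comes out “over 59% further”: (323 − 203) ÷ 203 ≈ 0.591.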
So what’s going on?
There are a few things that could affect the numbers Edmunds is returning. Even though the company has a set protocol, it’s impossible to drive every single EV tested in exactly the same way, under exactly the same conditions.
As Edmunds notes, there was a 12-degree difference in ambient temperatures across its tests. One person’s real-world results won’t translate to everyone else’s.
What’s more, Edmunds’ tests place a greater emphasis on city driving than the EPA protocols do. Combined with regenerative braking, the stop/start nature of town and city driving can lead to higher ranges.
It’s possible that Edmunds’ testing protocol simply didn’t suit the Teslas they were driving. But there are other things, specific to Tesla, to consider.
Another reason Tesla falls short is how it adjusts its EPA estimates to get the biggest number possible, most likely for marketing reasons.
As this article from Car and Driver explains, the range figure the EPA comes up with isn’t as simple as just testing the car on a dyno and seeing how long it can last.
EPA tests let manufacturers apply a voluntary range reduction, and the agency applies an adjustment factor that can vary from vehicle to vehicle, and brand to brand. There are also ways to game the system that are too complex to cover here; head over to Car and Driver for more on those.
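To make the mechanics concrete, here is a rough, hedged sketch of how those knobs interact. The 0.7 multiplier is the default adjustment factor Car and Driver describes for standard EV dyno testing; the function name and the specific numbers are illustrative assumptions, not the EPA's actual published procedure.

```python
def epa_label_range(dyno_range_miles, adjustment_factor=0.7, voluntary_reduction=0.0):
    """Illustrative estimate of a label range from a raw dyno range (miles).

    adjustment_factor:   scales the raw dyno result down toward real-world
                         conditions (0.7 is the commonly cited default;
                         manufacturers can run extra test cycles to justify
                         a higher factor, i.e. a bigger label number).
    voluntary_reduction: optional fraction a manufacturer may knock off
                         the adjusted figure.
    """
    adjusted = dyno_range_miles * adjustment_factor
    return adjusted * (1 - voluntary_reduction)

# A hypothetical 400-mile raw dyno result with the default factor:
print(round(epa_label_range(400), 1))  # 280.0
# The same result with a higher, five-cycle-justified factor of 0.75:
print(round(epa_label_range(400, adjustment_factor=0.75), 1))  # 300.0
```

The point of the sketch: two manufacturers with identical dyno results can legitimately print quite different range numbers on the label, depending on which factor they test their way into.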
The crux of the matter, though, is that Tesla has been in the game longer than most, and it knows how to get the absolute most out of its vehicles for EPA tests. So much so, its EPA range figures often far exceed what drivers get in the real world.
So if we can learn anything from all of this, it’s to take EPA range estimates with a handy helping of salt. Depending on what EV you buy, how you drive it, and in what conditions it’s driven, you might get more or less range than the EPA quotes. Consider EPA figures a ballpark estimate at best.
Published February 12, 2021 — 08:57 UTC