Intel Claims Victory in Mobile Performance, Dodges Battery Life Benchmarks
Another day, another Intel presentation featuring benchmarks of questionable origin. Around the time the Tiger Lake-based 11th Gen CPUs were announced, Intel made several claims about why its new CPUs were so much better than AMD's, but the thing that stuck out most to me was a set of benchmarks purporting to show Intel CPUs far outpacing AMD CPUs when laptops were tested unplugged. Today, Intel released a presentation on that same theme.
Getting to Intel's presentation: they tested 5 Ryzen 4000 laptops and 5 11th Gen laptops, with everything left at default settings except for screen brightness, which was set to 200 nits. Nothing strange there, except that this is essentially all the test methodology Intel provides, which is fairly lacking. After the methodology, this is the very first thing they show:
The blue bar represents the average score the Intel and AMD laptops achieve in the MobileMark 18 benchmark, showing Intel more than 50% in the lead. The slide also shows battery life in minutes, where AMD is given the win by around an hour. Most of you have probably never heard of MobileMark 18, and I hadn't either; it turns out to be a benchmark from BAPCo, one of Intel's satellite benchmark companies. So Intel essentially gave themselves this 50% win. Worse, battery life is not measured with an actual benchmark but is estimated from nothing more than the battery's capacity, which is an obviously misleading way to present battery life.
For the next 7 slides, Intel just shows benchmarks, with results for each laptop. In all of them Intel wins whether the laptop is plugged in or not, but when it's unplugged, Intel wins by a massive margin, around 50% or so. Of these 7 benchmarks, only 4 actually measure real-world performance, which is supposedly Intel's shtick these days… except, apparently, when the fake benchmarks show Intel winning. Of the 3 fake benchmarks (or synthetic benchmarks, if you're a normal person), one is also from BAPCo, another is from the infamous Principled Technologies, and the last is PCMark 10, which is widely used but not one I'd describe as among the best. Furthermore, Intel showed only some of the PCMark 10 sub-scores, presumably the ones that cast AMD in the worst possible light.
I actually went through all 7 of these slides and estimated how much faster Intel was in every single one. You will see in the table below why Intel decided to once again abandon its “real world benchmarks” mantra.
[Table: Intel's Margin of Victory — workloads including PPT to PDF, Excel to Word, and Word to PDF, with results plugged in and not plugged in. Disclaimer: estimation, raw data not provided by Intel.]
Wow, twice as fast as AMD in PCMark 10! And almost twice as fast in the Intel-sponsored benchmarks too! No wonder Intel throws "real world benchmarks" out the window so often when it can get results like these. Looking at just the real-world benchmarks, Intel is about 35% faster plugged in and about 50% faster unplugged. Then Intel whips out this hilarious slide:
It really looks like Intel is accusing AMD and Maxon of colluding to make Ryzen CPUs look much better than they really are. Except AMD isn't the one touting performance on battery; Intel is, which should make you wonder why Intel is saying this in the first place. This is clear projection: 3 of the 8 benchmarks Intel showed came from its satellite companies, and if Intel is emphasizing performance on battery this much, it stands to reason that the laptops it benchmarked were chosen precisely because they are tuned for performance on battery. I'm not sure Intel's marketing team even understood the hypocrisy here.
There are a few more slides showing how AMD CPUs are bad at boosting (without providing any comparison to Tiger Lake CPUs, of course), and then they end on this summarizing slide:
I think the most illuminating part of the entire presentation is the second bullet point: "optimal balance of responsiveness and battery life". Let's go back to the very first benchmark slide, with the battery size comparison and MobileMark 18. We know that when Intel wins, it exaggerates the margin; when Intel loses, it likewise understates the loss. The only result Intel ever admitted losing was battery life, and there it did not use a benchmark but estimated battery life solely from battery size.
The one thing Intel is not showing you here is real battery life, and that is almost certainly because battery life on these Intel laptops is very poor under these workloads. I don't have access to an 11th Gen laptop to test this theory, but it's a simple truth that more performance means more power consumption. AMD laptops perform worse on battery because they intentionally consume less power in order to extend battery life. If Intel laptops perform about the same whether or not they're plugged in, their power consumption is about the same, and that would likely mean poor battery life, perhaps far worse than Intel is claiming. Ultimately, it doesn't even matter that Intel does well in these benchmarks, because it's draining the battery to do it. I think the vast majority of mobile users would rather have more battery life than more performance.
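To make that tradeoff concrete, here's a back-of-the-envelope sketch (all wattage and capacity figures are hypothetical, since Intel published no power data): runtime is simply battery capacity divided by average power draw, so a chip that keeps drawing full, plugged-in-level power on battery finishes benchmarks faster but runs the battery down far sooner than one that throttles.

```python
# Hypothetical illustration of the performance vs. battery-life tradeoff.
# All numbers are made up for the example; Intel provided no power figures.

def runtime_hours(capacity_wh: float, avg_power_w: float) -> float:
    """Estimated runtime = battery capacity / average system power draw."""
    return capacity_wh / avg_power_w

CAPACITY_WH = 50  # same battery in both scenarios

# Chip that keeps drawing full power on battery (plugged-in-level performance):
full_power = runtime_hours(CAPACITY_WH, avg_power_w=25)  # 2.0 hours

# Chip that throttles on battery (lower performance, much lower draw):
throttled = runtime_hours(CAPACITY_WH, avg_power_w=10)   # 5.0 hours

print(f"Full power: {full_power:.1f} h, throttled: {throttled:.1f} h")
```

The same arithmetic is why quoting battery capacity alone, as Intel's first slide does, tells you nothing: two laptops with identical batteries can have wildly different runtimes depending on how much power their chips draw.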
So, yet another misleading, poorly made presentation from Intel. Sometimes I wonder whether AMD's and Nvidia's marketing could possibly be any worse, and then Intel helpfully reminds me that yes, it can. At least Intel can say they're proving something… good for them, I guess.
Liked it? Take a second to support Matthew Connatser on Patreon!