System Performance

One aspect at which Google’s Pixel devices have always excelled is performance. With every generation, Google has opted to customise the BSP stack and improve on Qualcomm’s mechanisms in order to extract as much performance out of the SoC as possible. In recent years these customisations haven’t been quite as evident, as Qualcomm’s schedulers have become more complex and more mature. The Pixel 4 again makes use of Qualcomm’s scheduler mechanisms instead of those of Google’s own Android Common Kernel. The Pixel 4 also ships with Android Q, making it one of the very few devices in our testbench running the new OS version.

We’re testing the Pixel 4 at three refresh rate settings: the default 60Hz mode, the automatic 90Hz mode (which can fall back to 60Hz depending on content and conditions), and the forced 90Hz mode. As with the OnePlus 7 Pro earlier in the year, we’re expecting to measure differences between the display modes.
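For context on what these modes mean to applications: Android exposes the panel’s refresh rates through the standard display-mode API, and an app can ask for a specific mode while the system’s own 60/90Hz policy still has the final say. The Kotlin sketch below is purely illustrative rather than our actual benchmark setup: it enumerates the advertised modes and requests the fastest one (90Hz on the Pixel 4) via preferredDisplayModeId; the activity name is hypothetical.

import android.app.Activity
import android.os.Bundle
import android.util.Log

// Illustrative sketch only: list the display's advertised modes and request the
// fastest one. On the Pixel 4 this surfaces the 60Hz and 90Hz panel modes.
class HighRefreshActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        @Suppress("DEPRECATION")
        val display = windowManager.defaultDisplay
        val modes = display.supportedModes
        modes.forEach { Log.d("Refresh", "mode ${it.modeId}: ${it.refreshRate} Hz") }

        // Request the fastest advertised mode; the system can still override this
        // based on its own refresh rate policy (brightness, content, settings).
        val fastest = modes.maxByOrNull { it.refreshRate } ?: return
        window.attributes = window.attributes.apply {
            preferredDisplayModeId = fastest.modeId
        }
    }
}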

PCMark Work 2.0 - Web Browsing 2.0

Starting off with the web browsing test, the Pixel 4 XL delivers rather average performance. The odd thing here is that it shows worse performance and scaling than last year’s Pixel 3 in all but the forced 90Hz mode. It’s also interesting that the forced 90Hz mode is able to post an advantage over the automatic 90Hz mode, even though nothing in the benchmark’s content should cause the automatic mode to drop down to 60Hz.

PCMark Work 2.0 - Video Editing

In the video editing test, whose results aren’t all that meaningful, we do however see differences between the 60 and 90Hz modes. Again, it’s odd to see the 60Hz mode perform that much worse than the Pixel 3 in this test, pointing to more conservative scaling of the little CPU cores.

PCMark Work 2.0 - Writing 2.0

In the Writing test, which is the most important sub-test of PCMark and features heavier workloads, the Pixel 4 performs very well and is in line with the better Snapdragon 855 devices out there.

PCMark Work 2.0 - Photo Editing 2.0

The Photo Editing scores of the Pixel 4 are also top notch, the best of any Snapdragon 855 device we have at hand.

PCMark Work 2.0 - Data Manipulation

The data manipulation test is another odd one: I can’t really explain why it performs better in the forced 90Hz mode than in the automatic 90Hz mode.

PCMark Work 2.0 - Performance

Finally, the Pixel 4 ends up high in the ranks in PCMark, really only trailing the Mate 30 Pro.

Speedometer 2.0 - OS WebView

JetStream 2 - OS WebView

WebXPRT 3 - OS WebView

In the web benchmarks, the Pixel 4’s results range from rather average to actually quite bad compared to what we’ve seen from other Snapdragon 855 phones. I’m really not sure why this degradation takes place; I’ll have to investigate further once I have another Snapdragon 855 device running Android Q.
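As a reminder, the “OS WebView” label means these benchmarks are rendered through the system WebView component rather than the Chrome browser. The Kotlin sketch below is a hypothetical minimal harness, not our actual test setup: it simply points a WebView at the public Speedometer 2.0 page.

import android.app.Activity
import android.os.Bundle
import android.webkit.WebView

// Hypothetical harness for running a web benchmark in the system WebView.
// Requires the INTERNET permission in the app manifest.
class WebViewBenchActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val webView = WebView(this)
        webView.settings.javaScriptEnabled = true   // the benchmarks are pure JavaScript
        setContentView(webView)
        webView.loadUrl("https://browserbench.org/Speedometer2.0/")
    }
}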

Performance Conclusion

Overall, performance of the Pixel 4 is excellent, as expected. The big talking point here isn’t really the SoC or Google’s software, but rather the 90Hz screen of the phone. It really augments the experienced performance of the phone, making it stand out above other 60Hz phones this year.

That being said, unlike last year, I can’t say that the Pixel 4 is amongst the snappiest devices this year, as that title has already been taken by the new Huawei Mate 30 Pro with its newer-generation Kirin 990. Unfortunately for Google, the Pixel 4’s performance will be a rather short-lived selling point, as I expect the competition (those that don’t already have the feature) to catch up with high refresh rate screens, and to surpass the Pixel once the new generation of Snapdragon SoCs launches, just a month from now.

Comments

  • Andrei Frumusanu - Friday, November 8, 2019 - link

    It goes up to 600 nits in HDR video content, I don't currently have a good methodology to measure that.
  • s.yu - Friday, November 8, 2019 - link

    Daylight:
    I now realize that the crushed black issue is probably not really an issue, it's an attempt to hide the raised noise floor resulting from stacking underexposed shots. Take the poorly lit loading area of that building in sample 3 for example, Pixel 3 displays a similar depth into the shadows to iP11P and S10+, but Pixel 4 goes further at the cost of a lot more noise, while Mate30P simply has more shadow DR because of the larger sensor and that it doesn't use such a stack.
    Also I don't believe the difference in shadow mapping(darker, but far from pitch black leaves, vs. lighter leaves) is evidence of raised DR. It's only raised DR if deeper shadows are revealed at the cost of no more noise, or if the same shadow is now less noisy, but neither is true as evident in the foreground shadow of sample 4, Pixel 4's shadow is, as I noticed at GSMA without even a direct comparison, more noisy than Pixel 3's. Not only noisy but flat-looking, an overdone HDR effect. It's so bad there's blotching, under the roof of the building to the left in sample 4. The costs seem to notably outweigh the gains, or Pixel 4 used an even faster shutter for this specific shot to recover more highlight, resulting in worse shadows. In this shot iP11P's hybrid bracketing approach utilizing longer exposure time for some of the frames shows its strength and delivers the most solid shadows among the different stacking approaches. So if Apple were to lift shadows to the same extent as Pixel 4 then we might have been able to see deeper shadows back in sample 3.
    Come to think of it, Pixel 4's noisy shadows were even notable in the official samples leaked before launch. I thought it was due to pre-retail firmware but I was too optimistic.
    Looks like Apple's is the best overall but Pixel 3 has the most stable detail retention. Samsung still lacks detail as it has been for quite some time but at least DR wise it generally matches Apple. Huawei's latest Mate30P greatly dials down both sharpening and NR resulting in a surprising number of keepers, only sometimes sharpening would seem too low yielding what's now not waxiness but a thin haziness, or maybe the sharpening threshold is too high.
    Low light:
    In the first sample Pixel 4 obviously added more NR, not a good choice as the noise level of Pixel 3 in this image is completely manageable, more NR only smeared the output. iP11P seems between the two Pixels in a number of metrics. Huawei's night mode apart from somewhat aggregated texture intensely suppresses highlights at the cost of various artifacts present since P20P, can't say I ever liked it.
    Sample 2: Again Pixel 4 is rather taking the shove the noise floor in your face approach rather than actually yielding more shadow DR, just look at the blotchy sky compared to its predecessor, however it's able to better recover highlights from the office windows.
    In the castle sample Pixel 4 definitely does something right, it's not only much cleaner but the blotchy sky is gone. Also I suspect less falloff from the new lens, which yields better corners. Mate30P's technically very large UWA is better than the Samsung's tiny sensor yet a lot worse than the main, perplexing as this sensor's output stacked should still be competitive, also the fact that Mate30P's night mode is only ~21mm equiv. which is halfway between the other two UWA(~12-13) and the mains, in fact leaning toward the FoV of the mains is another issue to consider.
    Second last sample: Apple's algorithm of being able to extract data from relatively sufficient light but complete inability to work with deeper shadows in low light reminds me of the Kandao RAW+ I tried out recently, it's similar in that decent data from a single frame could be greatly improved when stacked, but really poor data from a single frame is left untouched no matter how many frames you stack, which could result in very clean highlights yet completely useless shadows at a certain noise level.
  • Andrei Frumusanu - Friday, November 8, 2019 - link

    Thanks for your input, great post.
  • s.yu - Saturday, November 9, 2019 - link

    :)
  • s.yu - Saturday, November 9, 2019 - link

    Thanks for the great samples, your hands seem even steadier than before and the framing is far more consistent than what I expect of handheld comparisons.
  • hoodlum90 - Friday, November 8, 2019 - link

    Great analysis and I agree with many of your comments. Each of these cameras provides a slightly different look with different emphasis placed on Noise Reduction / Detail, White Balance, shadows, highlights, etc. I think a lot of it comes down to your individual preference.

    Personally I do not like how Samsung applies a lot of noise reduction which impacts detail. But I know there are many that prefer this approach.

    The Night scenes seem to provide the greater variance. One example is the Castle where the S10, Pixel and iPhone provide similar renderings with differences in white balance, noise reduction and detail. The P30 Pro seems to take a different approach that I do not like. In this scene the P30 Pro provided a more flat lighting that seems unrealistic. The background trees are a brighter green and the sky has lost all cloud detail. This reminds me of the HDR from the previous gen iPhones that also did not look realistic.
  • s.yu - Saturday, November 9, 2019 - link

    "Personally I do not like how Samsung applies a lot of noise reduction which impacts detail."
    Yeah, me neither. What's even more worrying was that the Note10 series had that up a notch compared to the S10 series which was at least largely on par with Note9 and S9. There's hope for a different turn if S11 uses that new HMX, or at least 27MP would still match the pixel downsampled to 12MP, but there's also speculation that the flagship won't use the best and largest sensor(seeing how Mi Note 10 is close to 1cm thick it makes sense).

    "The Night scenes... that I do not like."
    Yeah, me neither. :) It's too flat, not only from Andrei's review but also from GSMA's review that came out faster with far fewer samples. In some instances P30P/Mate30P would suppress highlights so much that a night scene is turned to a dusk scene as street lamps look dim and weak and cast light not significantly brighter than the surroundings. It's completely overdone and it's a mistake I wouldn't have made 10 years ago when I just started to post process my photos.
    Also if you look closely at the trees I would suspect that it's artificially colored. RYYB should distinguish green poorly(it leaks longer wavelength through by design after all) and the green looks mushy with a fake-looking color tone leaning towards cyan. I wouldn't be surprised if this color was filled by the so-called "AI" as it might have recognized trees when the camera failed to determine the actual color in that area, while it's not precise enough for a more convincing fill.
  • hoodlum90 - Friday, November 8, 2019 - link

    There is a little trick to force the Night Mode option on the iPhone 11. The iPhone seems to determine the mode based on what it is focusing on. If you manually focus on one of the darker areas in the night photos where Night Mode was not an option, then it would show up as an option. You just need to remember to manually lower exposure as the scene may end up brighter than expected. Something similar can be done to force the "deep fusion" mode in certain instances although that is more difficult to determine as the iPhone doesn't tell you when this mode is used.

    This seems to be a quirk that Apple can easily fix but for now needs to be worked around manually.
  • Spencer1 - Sunday, November 17, 2019 - link

    Thanks for this nugget. I’ll try it when I get my unit.
  • Hulk - Friday, November 8, 2019 - link

    I only care about the following things when checking out a phone.

    How does it feel in my hand?
    How is the display?
    How long does the battery last?
    How fast does the camera open?
    How good is the camera?

    Honestly all high end phones are pretty good on these points. I have a Pixel 2 because I'm a Google Fi subscriber. I'll pass on the Pixel 4. I don't see any significant improvements on the points I listed above.
