MKBHD · Tech

Why New Smartphone Cameras Feel Worse

TL;DR

Excessive computational photography (HDR, tone mapping, face detection) applied to normal daylight shots makes newer phones produce over-processed, unnatural-looking images.

Key Points

1. Modern smartphones have essentially plateaued in basic daylight photo quality. Any phone from the past five years produces perfectly usable daylight shots; an iPhone 11 and an iPhone 17 look nearly identical without pixel-peeping, so brands no longer compete on good-light performance.
2. The real camera battleground is now extreme edge cases. Low light, fast-moving subjects, and heavily backlit scenes are where newer phones pull ahead, using multiframe HDR, tone mapping, and face detection. The video demonstrates this with a Pixel 10 correctly exposing a fully backlit scene that a Nexus 4 blows out completely (see the merge sketch after this list).
3. Over-applying these computational tricks ruins normal photos. The Samsung Galaxy S progression shows the S9 introducing smart HDR as a breakthrough, but by the S26 the same indoor shot shows unnatural haloing around windows, glowing skin tones, and a flat, over-processed look; many viewers prefer the S23's rendering.
4. The fix is knowing when to turn processing up or down. Brands already know these tricks look weird in excess; the balancing act is suppressing them in easy conditions (see the gating sketch below), and some third-party camera apps can reduce post-processing. Links are provided in the video description.
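
To make point 2 concrete: multiframe HDR is, at its core, a weighted merge of several bracketed frames. Below is a minimal sketch of that idea, not the video's or any vendor's actual pipeline; the function names and the Mertens-style well-exposedness weight are illustrative assumptions, and real pipelines also align frames and merge per pyramid level.

```python
import numpy as np

def well_exposedness(img: np.ndarray, sigma: float = 0.2) -> np.ndarray:
    """Per-pixel trust weight: pixels near mid-gray (0.5) score highest,
    blown highlights and crushed shadows score near zero."""
    return np.exp(-((img.mean(axis=-1) - 0.5) ** 2) / (2.0 * sigma ** 2))

def merge_bracket(frames: list[np.ndarray]) -> np.ndarray:
    """Merge bracketed exposures (each HxWx3, values in [0, 1]) by taking,
    at every pixel, a weighted average that favors whichever frame exposed
    that pixel best."""
    weights = np.stack([well_exposedness(f) for f in frames])  # N x H x W
    weights /= weights.sum(axis=0, keepdims=True) + 1e-8       # normalize
    stack = np.stack(frames)                                   # N x H x W x 3
    return (weights[..., None] * stack).sum(axis=0)
```

This is why the backlit scene a Nexus 4 blows out comes back usable on a Pixel 10: the short exposures contribute the bright window, the long exposures contribute the shadowed subject.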
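
Points 3 and 4 come down to one dial: how hard the tone curve is applied, and when. Here is a minimal sketch of that balancing act, again illustrative only; the Reinhard-style curve, the 6-stop gating threshold, and all function names are assumptions, not any camera app's real API.

```python
import numpy as np

def dynamic_range_stops(img: np.ndarray, eps: float = 1e-4) -> float:
    """Rough scene dynamic range in stops: log2 ratio of the bright and
    dark luminance percentiles of an RGB image with values in [0, 1]."""
    lum = img.mean(axis=-1)                      # crude luminance proxy
    lo, hi = np.percentile(lum, [2, 98])
    return float(np.log2((hi + eps) / (lo + eps)))

def tone_map(img: np.ndarray, strength: float) -> np.ndarray:
    """Global Reinhard-style curve. strength = 0 is a no-op; higher values
    lift shadows and compress highlights, which in excess gives the flat,
    glowing look the video criticizes."""
    if strength <= 0.0:
        return img
    k = 4.0 * strength                           # curve aggressiveness
    mapped = img * (1.0 + k) / (1.0 + k * img)   # compress toward mid-tones
    return np.clip((1.0 - strength) * img + strength * mapped, 0.0, 1.0)

def process(img: np.ndarray) -> np.ndarray:
    """Gate the curve on measured scene contrast: an ordinary indoor or
    daylight scene (~6 stops or less) gets almost none, a harshly backlit
    scene (~12 stops or more) gets the full treatment."""
    dr = dynamic_range_stops(img)
    strength = float(np.clip((dr - 6.0) / 6.0, 0.0, 1.0))
    return tone_map(img, strength)
```

Forcing strength toward 1.0 on an ordinary indoor scene reproduces the flat, lifted-shadow look the video criticizes; real pipelines also apply such curves locally (per tile), which is where the halos around bright windows come from when pushed too hard.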
