The iPhone 11 Pro and Max models have recently dropped, and like every year, I’ve picked up the new model. This year’s headline feature is the new triple camera setup, now with night mode.
For the purposes of this article, I want to focus on just the video. If you’re looking for a review of the cameras for photography, I have another article I’ve written about that.
Right off the bat, the 2019 iPhones can do some things that real cameras, even expensive full-frame mirrorless models, can’t. Things like:
- 4K video at 60p (as well as 30p and 24p)
- Extended dynamic range using computational photography at all resolutions and framerates
- Stabilization that is smooth enough for walking and other large movement
- Three built-in lenses
- A large, very colour-accurate viewfinder (the phone screen)
- Hours of recording time, without any overheating whatsoever
And that’s really just the camera by itself, for video. Of course, being part of a smartphone means you can easily edit and share that video anytime, anywhere. In fact, the A13 Bionic chip is so fast that it’s very likely a good deal faster than your computer, unless you have a fairly new, high-end setup.
As a point of comparison, it has almost exactly the same multicore scores in Geekbench 4 as my 2017 5K iMac, with even faster single core performance. You can actually notice this if you use it for photo or video editing, say, to edit those 4K60p videos you take.
That computational photography element I mentioned earlier is a game-changer. In photography and video, it’s pretty common knowledge that larger sensors mean more dynamic range, more colour depth and better low light (high ISO) performance, as well as more depth of field control (the ability to have shallow depth of field).
One by one, computational photography is tackling each of these problems, beginning with dynamic range. In video, the iPhone actually captures at 120 fps and combines the data from two or more frames to maximize dynamic range in the scene. How well does this work? Amazingly well. You actually get more dynamic range with the iPhone than you do with a full-frame Sony A7III in video. The highlights are almost never blown, and the shadows have tons of detail.
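The general idea behind that frame merging can be sketched in a few lines. To be clear, this is a toy illustration of multi-frame exposure blending, not Apple’s actual (unpublished) pipeline; the function name and threshold are my own inventions.

```python
import numpy as np

def merge_exposures(short_exp, long_exp, threshold=0.9):
    """Toy HDR merge of two frames (pixel values scaled 0..1).

    Wherever the long exposure nears clipping, blend in the short
    exposure, which still holds highlight detail; elsewhere keep the
    long exposure's cleaner shadows. Purely illustrative.
    """
    # Weight ramps from 0 to 1 as the long exposure approaches white.
    w = np.clip((long_exp - threshold) / (1.0 - threshold), 0.0, 1.0)
    return (1.0 - w) * long_exp + w * short_exp

# A blown highlight (1.0) in the long frame takes the short frame's
# value; a midtone keeps the long frame's value.
merged = merge_exposures(np.array([0.5, 0.1]), np.array([1.0, 0.2]))
```

A real pipeline would also align the frames and weight per channel, but the principle — trade highlight data from short exposures against shadow data from long ones — is the same.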
In terms of sharpness, the 4K from the iPhone is generally very good. iPhones have had 4K video since the 6S, but until the XS generation it wasn’t very detailed. It’s now great… enough that you can crop in 4x and still get a good 1080p video out of it. This also means you can get good-quality video at 4x zoom in good light, and 6x if you’re willing to accept 720p (which I’m not).
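The arithmetic behind those zoom figures: a 1080p frame cut out of 4K (3840 px wide) is a 2x linear crop (4x by pixel count), a 720p frame is a 3x crop, and multiplying by the 2x telephoto lens gives the 4x and 6x totals. A quick sketch (the function name is mine):

```python
# Equivalent zoom gained by cropping a smaller output frame out of
# 4K UHD (3840 px wide), optionally on top of an optical zoom factor.

UHD_WIDTH = 3840

def total_zoom(output_width, optical_zoom=1.0):
    # Linear crop factor is the ratio of source width to output width.
    return optical_zoom * (UHD_WIDTH / output_width)

print(total_zoom(1920, optical_zoom=2.0))  # 1080p crop on the 2x tele -> 4.0
print(total_zoom(1280, optical_zoom=2.0))  # 720p crop on the 2x tele -> 6.0
```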
Then there’s colour depth. This is still an area where cell phones lag behind, and to my eye it’s the most noticeable difference between the iPhone and real cameras. The small sensor simply captures less colour information, but most people don’t notice, because the processing turns up the saturation. And even though we have seen steady improvements here over the years, I find the lack of colour information far more noticeable in video than in photos. I’d love to see Apple up the video spec to 10-bit HDR next year.
The triple camera setup on the iPhone 11 Pro Max is very versatile – close to an ideal kit for me, in fact. However, there are some things to note.
The ultrawide lens lacks autofocus as well as optical image stabilization. Unfortunately, the lack of autofocus means you can’t do one of the things ultrawides are great for: getting up close to your subject for maximum distortion. Subjects 1 m and beyond are in focus, but anything closer goes soft.
Both the ultrawide and the telephoto lens use smaller sensors than the main 26 mm camera, as well as smaller apertures, which adds up to significantly poorer video in low light. The ultrawide is definitely not recommended in even somewhat dim conditions, and in bad light Apple silently disables the telephoto and just crops the main lens to 2x, without even telling you. Since you can crop the main lens to 2x without a massive drop in quality (still better than the actual telephoto lens in bad light anyway), I think Apple should have gone with at least a 3x lens instead of the 2x.
The lack of optical image stabilization on the ultrawide doesn’t matter that much, as the iPhone relies mostly on software-based stabilization for video anyway. In good light, the stabilization is fantastic on all three lenses. It is close to looking like it’s on a gimbal… very cinematic, and great for videos where you’re walking.
But when the light drops, there is an effect I’m not too fond of that I call “shimmer.” Because of its small sensor, the iPhone lowers the shutter speed as far as it can, to 1/30 or even 1/24 if you have enabled that option in the menu. This means each frame now carries a substantial amount of motion blur. Once the stabilization is applied, you see things like lights holding a fixed position in the frame while simultaneously blurring and distorting. You see this on most small-sensor devices that employ software stabilization in low light, but it can be minimized by shooting at 60p all the time. You’d think that would mean more noise, but in my tests, the difference was not noticeable at all.
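The arithmetic behind that 60p recommendation is simple: the streak a moving object leaves in one frame is its speed multiplied by the shutter time, so capping the shutter at 1/60 (which shooting 60p forces) halves the per-frame blur relative to 1/30. A minimal sketch, with a made-up pixel speed purely for illustration:

```python
def motion_blur_px(speed_px_per_s, shutter_denom):
    """Streak length in pixels left by an object moving at a constant
    speed, for a shutter time of 1/shutter_denom seconds."""
    return speed_px_per_s / shutter_denom

# A light sweeping across the frame at 600 px/s:
print(motion_blur_px(600, 30))  # 1/30 s shutter -> 20.0 px of smear
print(motion_blur_px(600, 60))  # 1/60 s shutter -> 10.0 px of smear
```

Half the smear per frame means the stabilizer has far less blurred-but-stationary detail to fight against, which is why the shimmer mostly disappears at 60p.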
So… Can It Replace a Real Camera for Video?
I toyed with the idea of carrying an RX100 V or VII instead of using my phone for everyday videos, and actually ended up buying a V. I chose that model because it has superbly sharp 4K video, downscaled from a 5.2K full-sensor readout, with great colour and decent low-light performance. However, I found it difficult to carry around all the time, due not only to its fragility but also its size and weight. I found it annoying to charge yet another device. And while I appreciated that it had more features and colour profiles (S-Log, even), I found myself questioning why I needed those. If I really wanted better results, why wouldn’t I just use my A7III? Finally, and this was the biggest reason: the stabilization made the RX100 useless for any kind of walking shots. Any motion faster than a deliberately well-controlled pan would appear jerky and, frankly, a pain to watch on a TV later. In the end, I decided that I would use my phone for everyday pictures and video, and keep the A7III handy for when I wanted the best quality.
So I suppose it depends on a few factors, but I’d say the answer is yes, the iPhone 11 Pro/Max can probably replace your camera.
It’s super fast to shoot, has great dynamic range, great battery life, amazing stabilization, very good quality for the most part, and three lenses to choose from for different focal lengths. In a pinch, you can even get the equivalent of over 310 mm of zoom at 720p. Finally, you can easily edit and share these videos on the spot, which seals the deal for me as an everyday camera.
Here’s a little sample: