I admit that I buy a new iPhone every year for the camera upgrades. My iCloud Photo Library is up to 14,633 photos and 1,261 videos and growing. Almost every one of those was captured with an iPhone, so its camera matters to me. Last year’s iPhone 6s introduced Live Photos, which I absolutely love. (This from November is just one of my favorites; you can view it on the web thanks to Tumblr’s new Live Photo feature.)
This year’s upgrade is Portrait mode on the iPhone 7 Plus, which intelligently applies a blur effect to the background of still shots. Portrait mode photos often look like they were shot on dedicated cameras, not smartphones. The feature requires the dual camera system on the iPhone 7 Plus and is only available in the iOS 10.1 beta for now. Apple wants to fine-tune Portrait mode’s performance before it hits primetime, but there are a few things we can already learn about it through testing.
First, people are really impressed by Portrait mode shots when you share them on social media, and they rarely suspect the photos came from an iPhone 7 Plus rather than a dedicated camera. Instagram reduces photos to such a low resolution and discards so much detail that flaws in the blur effect are harder to spot (example). I shared a few shots on Facebook last night too, and my colleague Dan from 9to5Toys had to verify they were from the iPhone 7 Plus. He described the samples as absolutely gorgeous, with no noticeable difference from a nice point-and-shoot camera.
Depth effect photos are easily shareable, and people instantly appreciate their quality. Live Photos, on the other hand, have a slight learning curve and generally only play nice in specific apps. Live Photos were also harder to edit until later software updates. Depth effect photos work with all of the usual editing tools, including cropping, rotating, applying filters, and adjusting color, light, and black-and-white levels. You can’t remove the depth effect after you capture the shot, but by default the Camera app saves a regular photo alongside the depth effect version so you can choose which to keep.
Using the depth effect feature is pretty simple too. My wife tried it before I got the chance and followed the tips in the Camera app to increase the lighting and find the right distance and focus. It’s a bit odd to see instructions like this in an Apple app, especially the Camera app, but they’re effective at taking anyone from not really knowing about the feature to capturing excellent shots with the blur effect.
The way the blur effect is artificially applied makes me think of Photoshop and users who add similar effects in post after a photoshoot. Except with the iPhone 7 Plus, you don’t need Photoshop or professional editing skills. Just follow the guidance in the Camera app and shoot great photos: move closer to or farther from your subject, make sure the lighting is decent, and you’ll see the effect in real time.
I’ve already noticed that sometimes it’s hard to choose between the original photo and the blur effect version. My advice: keep both when each looks good; other times, one or the other is clearly superior. Apple’s Photos app also creates a dedicated album called Depth Effect that shows only photos taken with Portrait mode, in case you want to browse just the new style of image.
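For anyone who tinkers with PhotoKit, that grouping appears to be exposed as a smart album subtype as well. Here’s a minimal sketch, assuming the smartAlbumDepthEffect subtype (which may not be present in this particular beta’s SDK), of how an app could fetch just those Portrait shots:

```swift
import Photos

// Fetch the system "Depth Effect" smart album and list its assets.
// Assumes the .smartAlbumDepthEffect subtype is available; it may not exist in this beta's SDK.
func fetchDepthEffectPhotos() -> [PHAsset] {
    let albums = PHAssetCollection.fetchAssetCollections(
        with: .smartAlbum,
        subtype: .smartAlbumDepthEffect,
        options: nil
    )
    guard let depthAlbum = albums.firstObject else { return [] }

    // Newest first, roughly matching how the Photos app presents the album.
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

    let assets = PHAsset.fetchAssets(in: depthAlbum, options: options)
    var result: [PHAsset] = []
    assets.enumerateObjects { asset, _, _ in
        result.append(asset)
    }
    return result
}
```

Nothing here is specific to Portrait mode beyond the album subtype; the assets come back as ordinary PHAsset objects that can be displayed or exported like any other photo.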
Depth effect has the added benefit of blurring out an unwanted background when you shoot. Sometimes the background is exactly what you want. It puts the photo in context. Proves you were actually at the Grand Canyon! But sometimes the background includes an overflowing trashcan or a pedestrian making a funny face. The blur effect can fight off the photo bombers.
Portrait mode is a beta feature within a beta operating system, so not everything is perfect yet. Apple can fine-tune a lot in software, so the feature will likely continue to improve over time, but it’s already quite impressive. I’ve noticed that Portrait mode has a difficult time properly applying the blur effect when capturing the side of someone’s face. This likely varies with the environment, but in my testing the outline of the face lacked the blur effect. You see it in the preview, though, so you don’t waste your time on a bad capture. The more complex the outline of the subject, the harder it is to capture correctly.
Because the blur effect relies heavily on a subject being within a certain distance in specific lighting, Portrait mode just isn’t ideal for capturing motion. Portrait mode is optimized to work with people, but it can be difficult to convince a three-year-old to hold still. This was especially clear when snapping my kid swinging at the park: Portrait mode just doesn’t work well with fast back-and-forth movement. Burst mode or Live Photos will work better, but you’ll miss the blurred background. (Side note on Live Photos with the iPhone 7 Plus: switching between 1x and 2x optical zoom is captured just like any other motion, so there’s a cool effect to try. Here’s an example.)
Finally, Portrait mode does not work with Live Photos. You have to pick one or the other; even the normal photo that Portrait mode captures alongside the depth shot isn’t a Live Photo. This is certainly a limitation of the current hardware, so I don’t expect it to change. I’ve used my iPhone 6s to shoot photos in some instances when I might have been better off using my Nikon 1 because I love the motion and sound captured with Live Photos. Now that same question of Live Photos versus better still photos is back, but this time it’s all on my iPhone. Maybe the iPhone 10 will do it all!
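As a rough illustration of that either/or split, PhotoKit exposes capture styles as mediaSubtypes flags on an asset. The photoDepthEffect flag is an assumption here since it arrived in a later SDK than this beta, so treat this as a hedged sketch rather than something guaranteed to compile against it:

```swift
import Photos

// Rough check of how a photo was captured, based on PhotoKit's subtype flags.
// .photoLive shipped back in the iPhone 6s era; .photoDepthEffect is assumed here and
// arrived in a later SDK than the beta discussed above.
func captureStyle(of asset: PHAsset) -> String {
    if asset.mediaSubtypes.contains(.photoLive) {
        return "Live Photo"
    } else if asset.mediaSubtypes.contains(.photoDepthEffect) {
        return "Portrait (depth effect) photo"
    } else {
        return "Regular photo"
    }
}
```

Because the two flags never appear on the same asset, a check like this only ever reports one style per photo, which mirrors the pick-one-or-the-other trade-off in the Camera app.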
Overall, I’m very pleased with what Portrait mode has to offer, and I’m eager to see how much the already good feature improves throughout the beta and in future software updates. If, like me, you rely on your iPhone for most of your photography and capture a lot of photos, it’s a seriously compelling reason to opt for the iPhone 7 Plus over the smaller model.
1x and 2x optical zoom without using Portrait mode isn’t so bad either: