
What's so ‘Fusion’ about the iPhone 16’s 48MP camera?


A camera with multiple personalities
A new word is being thrown about to describe the iPhone 16 family’s main rear camera. It’s the 48MP Fusion Camera.
The resolution? That’s not new. But the other word, Fusion, is. Apple uses the term Deep Fusion for its computational photography processing, which helps dramatically improve low-light shooting in particular. But Apple didn’t describe the last-generation iPhone 15 as having a Fusion Camera.
So what’s up?
Is ‘Fusion’ used in part to make it seem the iPhone 16 gets more of a camera upgrade than it really has? Sure, probably. But there’s also some substance behind the name. Potentially.
Getting to the bottom of this one could also help you make the most of an iPhone 16 or iPhone 16 Pro camera. It’s time for a closer look.

48MP Fusion camera: the cynical take
Why Fusion? This term has quite the history among cameras, given it doesn’t always refer to a single, specific piece of technology.
Iconic film director James Cameron’s 3D shooting array was dubbed the Fusion Camera System. It’s a dual-camera arrangement, used to capture video in native 3D for loads of films including the original Avatar, back in the (most recent) heyday of 3D cinema. Cameron co-developed it with director of photography Vince Pace.
GoPro’s first 360-degree camera was also called the Fusion, released back in 2017. It was the precursor to today’s GoPro Max, and knitted together the feeds of a pair of cameras with fisheye lenses, one on each side of its body.
The term Fusion has a fairly powerful sense of merging physicality and technology. And in both of these cases it refers to combining the efforts of two separate camera sensors and lenses to achieve a specific goal: 3D footage or 360-degree photos and video.
Does the same apply to the iPhone 16?

iPhone 16: one camera, three personalities
The iPhone 16’s Fusion camera isn’t quite like the GoPro Fusion or James Cameron’s cinematic rig, though. Its 48MP primary rear camera gets the Fusion label, not the entire rear array.
And there are actually several candidate features behind the name. The first is all about how Apple exploits the style of sensor used in the iPhone 16, and how it can take on multiple identities. It can behave like a 48MP camera, a 12MP one, and a 2x zoom that Apple claims matches the quality of a native zoom lens. But how?
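The arithmetic behind those three identities is easy to sketch. The snippet below is an illustration, not Apple's actual imaging pipeline, and the sensor dimensions are an assumption based on the 8064 × 6048 size of iPhone 48MP output files: a full readout gives 48MP, combining every 2x2 block gives 12MP, and a 12MP crop from the centre of the sensor covers half the field of view in each dimension, which is what gets marketed as 2x zoom.

```python
# Illustrative arithmetic for a 48MP sensor's three output modes.
# Dimensions are assumed from iPhone 48MP file sizes, not confirmed by Apple.
SENSOR_W, SENSOR_H = 8064, 6048

full_mp = SENSOR_W * SENSOR_H / 1e6                   # native readout: ~48.8MP
binned_mp = (SENSOR_W // 2) * (SENSOR_H // 2) / 1e6   # 2x2 binning: ~12.2MP
crop_w, crop_h = SENSOR_W // 2, SENSOR_H // 2         # central crop, half the
crop_mp = crop_w * crop_h / 1e6                       # field of view = "2x zoom"

print(round(full_mp, 1))    # 48.8
print(round(binned_mp, 1))  # 12.2
print(round(crop_mp, 1))    # 12.2
```

The point the numbers make: the 2x mode is the same 12MP pixel count as the binned mode, but drawn from a quarter of the sensor area, so it trades the light-gathering benefit of binning for a tighter framing.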
Just about all very high-resolution phone camera sensors use what’s known as a Quad Bayer array. If you want to get your head around phone cameras, you need to know a little about this kind of sensor.
When light enters a camera lens, it passes through a color filter and hits the light-gathering part of the sensor, called a photosite. There are many millions of these in a camera sensor.
In a standard camera sensor, there are red, green, and blue color filters for each pixel in the final image, with a sub-pixel photosite for each color underneath. The information from these little red-, green- and blue-serving dots is collated to form a pixel of image data in your actual photo.
With a Quad Bayer array, there are blocks of four photosites under each of the red, green and blue color filter sections, not one.
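The payoff of those four-photosite blocks is binning: the camera can combine each block into a single, larger effective pixel. The sketch below simulates that with a simple mean over each 2x2 block of raw photosite values; this is an assumption for illustration, as real sensors combine the signals at the hardware or raw-processing level in more sophisticated ways.

```python
# Minimal sketch of 2x2 pixel binning on a Quad Bayer-style grid.
# Illustrative only: real binning happens in hardware, not as a plain average.
def bin_2x2(pixels):
    """Average each 2x2 block of photosite values into one output pixel."""
    height, width = len(pixels), len(pixels[0])
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            block_sum = (pixels[y][x] + pixels[y][x + 1] +
                         pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block_sum // 4)
        out.append(row)
    return out

# A 4x4 grid of raw photosite values becomes a 2x2 binned image.
raw = [
    [10, 12, 200, 202],
    [14, 16, 198, 200],
    [50, 52, 90, 92],
    [54, 56, 94, 96],
]
print(bin_2x2(raw))  # [[13, 200], [53, 93]]
```

Averaging four noisy readings also smooths out random noise, which is why binned 12MP shots tend to look cleaner in low light than a full 48MP readout from the same sensor.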
