When Apple Inc introduced its triple-camera iPhone this week, marketing chief Phil Schiller waxed on about the device's ability to create the perfect photograph by fusing the main shot with eight separate exposures captured before it, a feat of "computational photography mad science".

"When you press the shutter button it takes one long exposure, and then in just one second the neural engine analyses the fused combination of long and short images, picking the best among them, selecting all the pixels, and pixel by pixel, going through 24 million pixels to optimise for detail and low noise," Schiller said, describing a feature called "Deep Fusion" that will ship later this fall.

It was the kind of technical digression that, in years past, might have been reserved for design chief Jony Ive's narration of a precision aluminium milling process to produce the iPhone's clean lines. But in this case Schiller, the company's most enthusiastic photographer, was heaping his highest praise on custom silicon and artificial intelligence software.
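The idea behind Schiller's description, selecting the best pixels from a stack of exposures, can be sketched in a few lines. The snippet below is a deliberately simplified toy, not Apple's actual method: Deep Fusion uses a trained neural network running on the A13's neural engine, whereas this sketch uses a crude hand-written detail score (an absolute Laplacian) to pick, for each pixel, the frame with the most local detail. All function names here (`local_detail`, `fuse`) are invented for illustration.

```python
import numpy as np

def local_detail(frame):
    """Crude per-pixel detail score: absolute Laplacian via shifted differences.
    (Stands in for the learned quality measure a real fusion pipeline would use.)"""
    up    = np.roll(frame,  1, axis=0)
    down  = np.roll(frame, -1, axis=0)
    left  = np.roll(frame,  1, axis=1)
    right = np.roll(frame, -1, axis=1)
    return np.abs(4 * frame - up - down - left - right)

def fuse(frames):
    """Pixel by pixel, take the value from whichever frame scores highest."""
    stack  = np.stack(frames)                          # shape: (n_frames, H, W)
    scores = np.stack([local_detail(f) for f in frames])
    best   = np.argmax(scores, axis=0)                 # winning frame index per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]

# Example: a flat frame versus one that also holds a sharp edge.
flat = np.full((8, 8), 0.5)
edge = np.full((8, 8), 0.5)
edge[:, 4:] = 1.0
fused = fuse([flat, edge])   # keeps the edge detail where it exists
```

A real pipeline would also align the frames first and blend weights smoothly rather than hard-selecting one frame per pixel; the hard `argmax` here only illustrates the "picking the best among them" step in the quote.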
Source: AsiaOne