Image Captured By iPhone Shows Three Different Versions Of The Same Person, Revealing An Error In Apple’s Computational Photography
01.12.2023 - 14:57
/ wccftech.com
The iPhone has relied on Apple’s computational photography techniques for years, and one would assume that the company’s image-processing algorithms would only improve with time. That assumption was put to the test when a woman posed in front of a mirror wearing her wedding dress: the captured result shows the person in the image looking different from her own reflections, indicating errors in Apple’s processing techniques.
Tessa Coates, a U.K. comedian and actress, was trying on a wedding dress and posed in front of a mirror while someone behind her took a picture with an iPhone to show how she would look from various angles. After inspecting the photo, Coates was perplexed to see what appeared to be three different people in the image, when in reality two of them were only her reflections. Coates described the experience on Instagram:
“Okay hello! Upgraded from stories to the grid. I went wedding dress shopping and the fabric of reality crumbled. This is a real photo, not photoshopped, not a pano, not a Live Photo. If you can’t see the problem, please keep looking and then you won’t be able to unsee it. Full story in my highlights (THE MIRROR). Please enjoy this glitch in the matrix/photo that made me nearly vomit in the street.”
So, what is really going on here? PetaPixel’s investigation, with a little help from AppleInsider, concluded that Apple’s computational photography could not distinguish between a reflection and a real person. The iPhone’s camera effectively treated the frame as containing three different people, so it produced a result in which Coates’ hands were in one position while each reflection’s hands were in another.
When the shutter was tapped while Coates was moving, the iPhone captured several images in quick succession. Apple’s computational photography then stitched those frames into a single image, choosing the best combination of saturation, contrast, detail, and sharpness from each. Because each “person” was drawn from a different moment in the burst, the three figures ended up frozen in different poses. The processed image caught Coates by surprise, and she was determined to get to the bottom of it, so she made her way to an Apple Store, where she met a technician named Roger.
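Apple does not publish the details of its fusion pipeline, but the basic idea described above can be illustrated with a toy sketch: take several burst frames, split them into tiles, and for each tile keep the version from whichever frame scores best on a sharpness metric. The function names and the Laplacian-variance sharpness proxy below are illustrative assumptions, not Apple’s actual algorithm.

```python
import numpy as np

def sharpness(tile):
    # Variance of a finite-difference Laplacian: a common proxy for focus.
    lap = (np.roll(tile, 1, 0) + np.roll(tile, -1, 0)
           + np.roll(tile, 1, 1) + np.roll(tile, -1, 1) - 4 * tile)
    return lap.var()

def fuse_burst(frames, tile=4):
    # frames: list of equally sized 2-D grayscale arrays from one burst.
    h, w = frames[0].shape
    out = np.empty((h, w))
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            # Keep each tile from whichever frame is sharpest there.
            best = max(frames, key=lambda f: sharpness(f[y:y+tile, x:x+tile]))
            out[y:y+tile, x:x+tile] = best[y:y+tile, x:x+tile]
    return out
```

The failure mode in Coates’ photo follows directly from this scheme: if the subject (or a reflection of the subject) moves between frames, different tiles can be selected from different moments, and each mirror image ends up frozen in its own pose.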
Roger told Coates that the iPhone is not a camera but a computer, one that takes a burst of images extremely quickly and stitches them together. He also told the comedian that what happened with her reflections was a ‘one in a million’ incident, meaning the same result would be hard to replicate. However, the image has gained traction and will most likely draw the attention of other people, including YouTubers, who will attempt to recreate it in the same fashion.
News Source: PetaPixel