
Engineer explains why Samsung Galaxy S23 Ultra’s moon photos are not faked


Photos of the moon taken with the Samsung Galaxy S23 Ultra have been suspected of being fake. Although Samsung officially denied the claim, some netizens still did not buy it.

Eric is a YouTuber and an engineer with 12 years of hardware engineering experience. He recently published a video that uses mathematics and logic to analyze how to distinguish real pictures from fake ones, and argued that Samsung may have been unfairly criticized.

Before starting his explanation, Eric stated that he was not defending Samsung, but wanted people to better understand why these pictures turn out the way they do. First, he asked viewers how they would tell a real picture from a fake one. He then showed a photo of the moon taken with an iPhone 14 Pro Max, in which only a white spot was visible.

However, although the photo taken by the iPhone 14 Pro Max is real, it does not accurately reflect how the moon looked to the person who took it, which defeats the purpose of a camera: to preserve the memory of what you saw with your own eyes so you can revisit it.

The YouTuber also mentioned that Samsung has explained the technology it uses in an article. To enhance a moon photo, the camera takes at least 10 frames and uses image processing to remove blemishes and noise. The sharpest parts of those frames are then combined into a single image, which is further enhanced by an AI model trained to recognize the various phases of the moon.
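Samsung has not published its pipeline, but the multi-frame step it describes can be approximated in a few lines. Below is a minimal sketch using OpenCV and NumPy; the sharpness-weighted merge and the file names are assumptions for illustration, not Samsung's actual implementation.

```python
# Minimal sketch of multi-frame stacking, loosely following Samsung's
# description: capture several frames, then favor the sharpest detail.
# Assumes the frames are already aligned with each other.
import cv2
import numpy as np

def stack_frames(paths):
    frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE).astype(np.float32)
              for p in paths]
    # Local sharpness estimate: absolute Laplacian response per pixel,
    # smoothed so the weights vary gradually across the image.
    weights = []
    for f in frames:
        lap = np.abs(cv2.Laplacian(f, cv2.CV_32F))
        weights.append(cv2.GaussianBlur(lap, (31, 31), 0) + 1e-6)
    frames = np.stack(frames)    # shape: (n_frames, H, W)
    weights = np.stack(weights)
    # Weighted average: sharper regions of each frame contribute more,
    # and per-frame sensor noise is averaged away.
    merged = (frames * weights).sum(axis=0) / weights.sum(axis=0)
    return np.clip(merged, 0, 255).astype(np.uint8)

result = stack_frames([f"moon_{i}.jpg" for i in range(10)])  # 10+ frames
cv2.imwrite("moon_stacked.jpg", result)
```

Note that nothing in this step invents detail; it only keeps the best of what the sensor captured, which is why the AI enhancement stage is a separate step.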

Eric also mentioned u/ibreakphotos, the Reddit user who used Gaussian blur and other steps to argue that Samsung misled consumers with "enhanced" moon photos. However, Eric said that imitating a moon photo with Gaussian blur is the exact opposite of what Samsung does: Samsung uses convolutional neural networks. He also noted that the Redditor discarded a great deal of detail by downscaling the moon image to 170 x 170 pixels and then applying Gaussian blur. With so little detail to work from, Samsung's AI has to make a best guess about what it is processing, so the end result will not exactly match the original downscaled image.
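The Redditor's test target is easy to reproduce. Here is a minimal sketch using Pillow (the file names and blur radius are placeholders, not the exact values from the original post): the resulting image is displayed full-screen and photographed with the phone.

```python
# Recreate a test image in the style of u/ibreakphotos: shrink a sharp
# moon photo to 170 x 170 pixels, then blur it so no fine detail survives.
from PIL import Image, ImageFilter

moon = Image.open("moon_original.jpg").convert("L")
small = moon.resize((170, 170), Image.LANCZOS)            # destroys fine detail
blurred = small.filter(ImageFilter.GaussianBlur(radius=3))
# Upscale for full-screen display; no real detail is added back.
display = blurred.resize((1024, 1024), Image.NEAREST)
display.save("moon_test_target.png")
```

Any crater texture the phone then "photographs" cannot have come from this target, which is why the result reveals how much the AI is guessing.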

The brown tint in the final image is a telltale sign that Samsung's convolutional neural network has run
The brown tint and the degraded edge quality are evidence that Samsung's convolutional neural network is at work. When the brown tone is removed, the picture looks very close to the one the Redditor had before applying Gaussian blur.
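Eric's tint-removal step can be approximated with a simple gray-world white balance. This is an assumption about his method, shown only to illustrate that removing a uniform color cast neither adds nor removes spatial detail; the file names are hypothetical.

```python
# Gray-world white balance: scale each RGB channel so its mean matches
# the overall mean, removing a uniform color cast (like the brown tint)
# while leaving all spatial detail untouched.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("moon_ai_output.jpg"), dtype=np.float32)
channel_means = img.reshape(-1, 3).mean(axis=0)
gain = channel_means.mean() / channel_means    # per-channel correction
balanced = np.clip(img * gain, 0, 255).astype(np.uint8)
Image.fromarray(balanced).save("moon_tint_removed.jpg")
```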

The real problem with this sequence of events is that Samsung has led consumers to believe that the camera hardware in its flagship smartphones is doing all the work, rather than the AI. That may make users think that if the moon can be shot this clearly, any photo taken at 100x zoom will be just as clear, which is not the case.
