Samsung has published a blog post that appears to respond to a viral Reddit post showing how the company's software seemingly adds a lot of non-existent detail to photos of the Moon. The blog post is a lightly edited translation of a similar post in Korean published by the tech giant last year.

The viral Reddit post that accused Samsung of fake Moon photos

In the viral post, Reddit user ibreakphotos explained why they believed the "Samsung space zoom shots are fake." According to the user, they downloaded a high-resolution image of the Moon, downsized it, and applied a Gaussian blur to remove all detail. They then displayed the image full screen on their monitor and turned off all the lights in the room before moving to the other end of the room and taking a picture of the screen with their Samsung phone. The phone then seemingly added a lot of detail where there was none.

Samsung "space zoom" moon shots are fake, and here is the proof, by u/ibreakphotos in Android

"I have to stress this: there's a difference between additional processing a la super-resolution, when multiple frames are combined to recover detail which would otherwise be lost, and this, where you have a specific AI model trained on a set of moon images, in order to recognize the moon and slap on the moon texture on it," wrote ibreakphotos in their Reddit post.

Samsung's explanation of what happened in the Reddit post

In its blog post, Samsung claimed that its Scene Optimizer feature combines several steps to generate better photos of the Moon. These include the company's Super Resolution feature, which uses multi-frame processing to combine more than 10 images to reduce noise and enhance clarity. It also combines optical and digital image stabilisation to reduce image blur. But the "magic" seems to happen in the "AI detail enhancement engine," which, according to Samsung, tries to identify whether the Moon is in the image before "enhancing details" with AI.
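Samsung has not published the actual algorithm, but the multi-frame noise-reduction idea behind Super Resolution can be illustrated with a minimal sketch: random sensor noise differs from frame to frame, so averaging N aligned frames shrinks the noise by roughly the square root of N while the underlying signal stays put. The scene values and noise level below are invented purely for illustration.

```python
import random

def average_frames(frames):
    """Average pixel values across aligned frames.

    Random sensor noise is independent between frames, so averaging
    N frames reduces its standard deviation by roughly sqrt(N),
    while the identical underlying signal is preserved.
    """
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Simulate 10 noisy captures of the same (made-up) 1-D "scene".
random.seed(0)
scene = [100, 120, 200, 80, 150]
frames = [[p + random.gauss(0, 10) for p in scene] for _ in range(10)]

stacked = average_frames(frames)

# Mean absolute error of a single frame vs. the stacked result.
single_err = sum(abs(a - b) for a, b in zip(frames[0], scene)) / len(scene)
stacked_err = sum(abs(a - b) for a, b in zip(stacked, scene)) / len(scene)
```

In this toy run the stacked estimate lands much closer to the true scene than any single noisy frame, which is the clean, uncontroversial part of Samsung's pipeline; the Reddit dispute is about the separate AI step layered on top of it.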
Or as Samsung puts it, "After Multi-frame Processing has taken place, Galaxy camera further harnesses Scene Optimizer's deep-learning-based AI detail enhancement engine to effectively eliminate remaining noise and enhance the image details even further."

This means there is a blurry line between whether we are actually seeing the Moon or what the Samsung phone thinks the Moon looks like when we photograph it. But at the end of the day, this is unlikely to bother all but the most hardcore camera and technology enthusiasts. The Moon is "tidally locked" to the Earth, which means we always see the same side of it, no matter what. So, whether the software is adding the details or they are actually present in the data from the camera, the image would technically be more or less accurate. Unless, of course, you trick the phone into thinking something is the Moon when it isn't, like some exceedingly clever Reddit users did.