Smartphone cameras are getting better every year, but compact sensors still have many limitations, especially when it comes to low-light photography and zooming. Samsung was one of the first companies to add a periscope lens to its smartphones for better zoom. With this lens, users can even take pictures of the moon with their phone. But this has become controversial, as some users have questioned whether these images are actually real. I say they are.
Modern smartphone cameras and AI
It is now impossible to talk about smartphone cameras without mentioning the improvements made by artificial intelligence. Pretty much every phone maker, from Apple to Google to Samsung, is using AI to improve the photos taken by users. In many cases, such technology can compensate for the lack of a large camera sensor.
For example, Apple introduced Deep Fusion and Night Mode with the iPhone 11. Both technologies use AI to combine the best parts of multiple frames into a single better image when there is little or no light. The iPhone and other smartphones also use AI to make the sky bluer, the grass greener, and food more appealing.
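Apple doesn't publish the internals of Deep Fusion or Night Mode, but the core trick behind multi-frame fusion can be sketched in a few lines. This is a minimal illustration, not Apple's pipeline; the sharpness weighting and the simulated scene below are my own assumptions:

```python
import numpy as np

def fuse_frames(frames: list) -> np.ndarray:
    """Fuse a burst of aligned frames by sharpness-weighted averaging.

    Real pipelines align frames, pick a reference, and merge per tile
    with learned weights; this sketch only shows the core idea that
    averaging many frames suppresses the sensor noise that dominates
    low-light shots.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    # Weight each frame by a crude sharpness score (variance of a
    # vertical-gradient high-pass) so blurrier frames contribute less.
    weights = np.array([(f - np.roll(f, 1, axis=0)).var() for f in stack])
    weights /= weights.sum()
    fused = np.tensordot(weights, stack, axes=1)
    return np.clip(fused, 0, 255).astype(np.uint8)

# Simulate a dark scene captured as a noisy nine-frame burst.
rng = np.random.default_rng(0)
scene = rng.integers(10, 60, size=(120, 120)).astype(np.float64)
burst = [scene + rng.normal(0, 15, scene.shape) for _ in range(9)]
fused = fuse_frames(burst)
print("single-frame error:", round(float(np.abs(burst[0] - scene).mean()), 1))
print("fused error:       ", round(float(np.abs(fused - scene).mean()), 1))
```

With nine frames, independent noise drops by roughly a factor of three (the square root of the frame count), which is part of how burst modes stand in for a physically larger sensor.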
Some phones do this more aggressively than others. But the fact is that almost every photo taken with a smartphone in recent years has been modified in some way by AI. You don't actually capture what you see with your eyes, but what your smartphone thinks will look best as a digital image.

Photos of the moon taken with the Galaxy S23 Ultra
Taking photos of the moon is quite challenging, as it is an extremely distant, luminous subject. Focusing on it is not easy, and you still need to set the correct exposure to capture all the details: the moon is a sunlit object surrounded by black sky, so auto-exposure tends to overexpose it into a featureless white disc. It's even more challenging for a smartphone camera.
You can take pictures of the moon with your iPhone if you use a camera app with manual controls, but they still won't look good because of the distance and the lack of a longer optical zoom.

One of the main features of Samsung's Galaxy S Ultra phones is the periscope camera, which enables up to 10x optical zoom. Combined with software tricks, users can take pictures at up to 100x zoom with these phones. The iPhone is rumored to get a periscope lens in the next generation, but for now it only offers a telephoto lens with 3x optical zoom (and only on the Pro models).
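Samsung doesn't detail those software tricks, but the step beyond the 10x optics is, at its core, a center crop plus upscaling, with multi-frame merging and sharpening layered on top. Here's a minimal sketch of just the geometric part; the function name and interpolation choice are my own:

```python
from PIL import Image

def digital_zoom(img: Image.Image, factor: float) -> Image.Image:
    """Geometric part of digital zoom: center-crop by `factor`, then
    upscale back to the original size. The real zoom pipeline also
    merges multiple frames and sharpens, which this sketch omits."""
    w, h = img.size
    cw, ch = max(1, int(w / factor)), max(1, int(h / factor))
    left, top = (w - cw) // 2, (h - ch) // 2
    crop = img.crop((left, top, left + cw, top + ch))
    return crop.resize((w, h), Image.LANCZOS)

# 100x total zoom = 10x from the periscope optics * 10x digital.
shot_10x = Image.new("RGB", (4000, 3000))  # stand-in for a 10x capture
shot_100x = digital_zoom(shot_10x, 10)
```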
With such a powerful zoom, Samsung promotes the camera of the Galaxy S23 Ultra, its latest flagship, as capable of taking pictures of the moon. And indeed it is.
Last week, a user on Reddit conducted an experiment to find out whether the moon photos taken with the S23 Ultra are real. The user downloaded a picture of the moon from the internet, reduced its resolution to remove all detail, and pointed the phone at a screen showing the blurry image. Surprisingly, the phone still produced a high-quality picture of the moon.
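Preparing a test image like that takes only a few lines of Python with Pillow, if you want to try it yourself. The file names and target resolution below are placeholders, not necessarily the exact values from the Reddit post:

```python
from PIL import Image, ImageFilter

# Take any high-resolution moon photo, shrink it until the surface
# detail is destroyed, then scale it back up and blur it so it can
# fill a monitor for the phone to photograph.
img = Image.open("moon_hires.jpg").convert("L")   # placeholder file name
tiny = img.resize((170, 170), Image.BILINEAR)     # detail is gone here
blurry = tiny.resize(img.size, Image.BILINEAR).filter(
    ImageFilter.GaussianBlur(radius=4)
)
blurry.save("moon_blurry.png")  # display full screen, aim the phone at it
```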
The internet was immediately flooded with people and news sites claiming that Samsung’s phones are taking fake pictures of the moon. Did Samsung lie all along? Not really.
Here’s how Samsung’s phones take pictures of the moon
Contrary to popular belief, Samsung does not replace user-taken photos with random high-quality moon photos. Huawei's phones, for example, actually do this: instead of using AI, their system composites pre-existing images of the moon into the final shot. Samsung, on the other hand, uses a lot of AI to deliver good moon shots.
Samsung has an article on its Korean website that describes how the camera algorithms in its smartphones work. According to the company, all of its phones since the Galaxy S10 have used AI to improve images.
For phones with a periscope lens, Samsung adds a "Super Resolution" feature that synthesizes details the sensor did not capture. When the phone detects the moon, it immediately uses AI to increase the contrast and sharpness of the image and to artificially boost the resolution of the details in it.
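Samsung's article doesn't include code, but the detection-gated logic it describes is straightforward to sketch. The snippet below reproduces only the conventional parts, contrast and sharpening, in Python with Pillow; the detail-synthesis step is a trained neural network that can't be condensed here, and the enhancement factors are my own guesses:

```python
from PIL import Image, ImageEnhance, ImageFilter

def enhance_if_moon(photo: Image.Image, moon_detected: bool) -> Image.Image:
    """Detection-gated enhancement, roughly as Samsung describes it.

    Only the plain image-processing steps are shown: a contrast boost
    and an unsharp mask, applied only when the scene classifier flags
    the moon. The production pipeline would then run a trained
    super-resolution model to synthesize lunar surface detail.
    """
    if not moon_detected:
        return photo
    boosted = ImageEnhance.Contrast(photo).enhance(1.4)  # guessed factor
    return boosted.filter(
        ImageFilter.UnsharpMask(radius=3, percent=180, threshold=2)
    )
```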
This feature works in much the same way as Pixelmator's ML Super Resolution and Halide's Neural Telephoto. The image isn't exactly replaced with another one; rather, a set of algorithms reconstructs it at a higher quality.
I did the test myself with a Galaxy S22 Ultra, which also has a periscope lens. I used a third-party camera app to take a RAW image of the moon at 10x zoom, then took a photo from the same position with Samsung's camera app. Overlaying both images, I can clearly see that they show the same shot, but the processed version has many more details, added by AI to make the image prettier.
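The comparison itself is easy to automate if you'd rather not trust your eyes. A short script, with placeholder file names standing in for my two exports:

```python
import numpy as np
from PIL import Image

# Load both shots as grayscale; resize the processed one to match the
# RAW export so the pixels line up. File names are placeholders.
raw = np.asarray(Image.open("moon_raw.png").convert("L"), dtype=float)
processed = Image.open("moon_processed.jpg").convert("L")
proc = np.asarray(processed.resize(raw.shape[::-1]), dtype=float)

# Identical framing with extra synthesized texture shows up as crisp
# edges in the difference image; misalignment would instead appear as
# ghosting in the 50/50 blend.
diff = np.abs(proc - raw)
print(f"mean per-pixel difference: {diff.mean():.1f} / 255")
Image.fromarray(((raw + proc) / 2).astype(np.uint8)).save("overlay.png")
Image.fromarray(diff.astype(np.uint8)).save("difference.png")
```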

What is a real photo, and what is a fake one?
I wrote this article after watching MKBHD's latest video, in which he asks "what is an image?" As mentioned before, all smartphones today, including the iPhone, use algorithms to improve images. Sometimes this works very well, sometimes it doesn't.
I recently wrote about how Smart HDR has made my photos look oversharpened, with exaggerated colors, which I don't like. Since the iPhone XS, many users have also complained about how Apple softens skin in photos. Are the photos taken with my iPhone fake? I don't think so. But they certainly don't look 100% natural either.
Samsung has done something really impressive by combining this software with the periscope lens. And as long as the company isn't replacing users' photos with alternative ones, I don't see why this is a bad thing. Most users just want to take good pictures, no matter what.
I'm sure that if Apple eventually introduces a feature that uses AI to make moon photos better, a lot of people will love it. And as MKBHD said, if we start questioning the AI used to enhance an image of the moon, we need to question every AI image enhancement.