Samsung’s Moon Shots Force Us to Ask How Much AI Is Too Much

Unlike, say, the Eiffel Tower, the moon’s appearance doesn’t change drastically with the lighting. Moon shots typically happen only at night, and Samsung’s processing breaks down when the moon is partially obscured by clouds.
One of the clearest ways Samsung is messing around with the moon is by manipulating the midtone contrast, making its topography more pronounced. However, it’s also clearly capable of introducing the appearance of textures and detail not present in the raw photo.
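To picture what a midtone contrast boost does, here’s a minimal sketch, assuming a simple S-curve that pivots at mid-gray rather than anything from Samsung’s actual pipeline: shadows and highlights are left mostly alone, while the midtones, where lunar topography lives, get pushed apart.

```python
import numpy as np

def boost_midtones(pixels: np.ndarray, strength: float = 0.4) -> np.ndarray:
    """Steepen the tone curve around mid-gray to exaggerate midtone contrast.

    pixels: grayscale image as floats in [0, 1].
    strength: 0 = no change; values up to 0.5 keep the curve monotonic.
    """
    # (p - 0.5) * p * (1 - p) is zero at black, mid-gray, and white, so the
    # endpoints and the pivot stay fixed while midtones are pushed apart.
    curve = pixels + strength * 4.0 * (pixels - 0.5) * pixels * (1.0 - pixels)
    return np.clip(curve, 0.0, 1.0)
```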
Samsung is doing this because the 100x zoom images on the Galaxy S21, S22 and S23 Ultra phones suck. Of course they do. They involve massive cropping into a small 10MP sensor. Periscope zooms in phones are great, but they’re not magic.
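The napkin math makes the point. Assuming the Ultra’s 10x optical periscope and treating 100x as a further 10x digital crop (the exact split is an assumption, not a spec-sheet quote), only a sliver of real sensor data survives:

```python
# Napkin math: how much real sensor data survives a 100x "Space Zoom" shot,
# assuming a 10x optical periscope in front of a 10MP sensor.
SENSOR_MP = 10        # periscope telephoto sensor, in megapixels
OPTICAL_ZOOM = 10     # magnification delivered by the lens itself
TARGET_ZOOM = 100     # the marketed zoom level

digital_crop = TARGET_ZOOM / OPTICAL_ZOOM     # 10x crop on top of the optics
effective_mp = SENSOR_MP / digital_crop ** 2  # area shrinks with the square
print(f"{effective_mp:.2f} MP of real data")  # ~0.10 MP, roughly 316x316 px
```

Everything above that 0.1MP of real data has to be invented by upscaling and processing, which is where the machine learning comes in.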
Credible theories
Huawei is the other big company accused of faking its moon photos, with the otherwise brilliant 2019 Huawei P30 Pro. It was Huawei’s last flagship released before the company was placed on the US trade blacklist, which effectively destroyed the appeal of its phones in the West.
Android Authority claimed the phone put a stock image of the moon in your photos. Here’s how the company responded: “Moon mode works on the same principle as other Master AI modes, recognizing and optimizing details in an image to help individuals take better photos. It doesn’t replace the image in any way – that would take up an unrealistically large amount of storage space since the AI mode recognizes over 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps optimize focus and exposure to enhance details such as shapes, colors and highlights/lowlights.”
Familiar, right?
You won’t see these techniques on too many other brands, but not for principled reasons. Unless a phone has at least a 5x telephoto zoom, a moon mode is largely pointless.
Attempting to shoot the moon with an iPhone is difficult. Even the iPhone 14 Pro Max doesn’t have the zoom range for it, and the phone’s auto-exposure turns the moon into a searing speck of white. From a photographer’s point of view, the S23’s exposure control alone is outstanding. But how “fake” are the S23’s moon images, really?
The most generous interpretation is that Samsung uses the real camera image data and merely applies its machine-learning knowledge to massage the processing. This could help it, for example, trace the outlines of the Sea of Tranquility and the Sea of Serenity when trying to tease a greater sense of detail out of a blurry source.
However, that interpretation is strained by the way the final image renders the positions of the Kepler, Aristarchus, and Copernicus craters with seemingly uncanny accuracy when these tiny features are imperceptible in the source. While you can infer where lunar features are from a blurry source, this is next-level stuff.
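To make the accusation concrete, here is a crude, purely hypothetical sketch of how fine texture from a stored or learned reference image could be grafted onto a blurry capture. This illustrates the technique critics allege, not a description of Samsung’s pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def graft_reference_detail(blurry: np.ndarray, reference: np.ndarray,
                           amount: float = 0.8) -> np.ndarray:
    """Borrow fine texture from a pre-aligned reference image and paste it
    onto a blurry capture, keeping the capture's overall tones.

    Both inputs are grayscale floats in [0, 1] with matching shapes.
    """
    # High-pass of the reference isolates fine craters and texture that the
    # blurry capture never resolved in the first place.
    detail = reference - gaussian_filter(reference, sigma=3)
    return np.clip(blurry + amount * detail, 0.0, 1.0)
```

The output looks sharper, but it contains information the camera never captured, which is exactly the line critics say Samsung has crossed.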
Still, it’s easy to overestimate just how much of an edge the Samsung Galaxy S23 gets here. Its moon photos may look good at first glance, but they still suck. A recent head-to-head video pitting the S23 Ultra against the Nikon P1000 shows what a decent sub-DSLR consumer superzoom camera is capable of.
A question of trust
The excitement surrounding this lunar issue is understandable. Samsung uses moon imagery to hype up its 100x camera mode, and the images are synthesized to some extent. But it has really just poked a toe out of the ever-expanding AI Overton window that has guided phone photography innovation for the past decade.
Each of these tech tricks, whether you call them AI or not, is designed to do what would have been impossible with the raw basics of a phone camera. One of the first, and probably the most momentous, was HDR (High Dynamic Range). Apple built HDR into its camera app in iOS 4.1, released in 2010, the year of the iPhone 4.
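HDR works by merging a bracket of frames shot at different exposures and keeping the best-exposed parts of each. As a minimal sketch of that classic exposure-fusion idea (not Apple’s actual implementation), weighting each pixel by how close it sits to mid-gray already gets you most of the way:

```python
import numpy as np

def fuse_exposures(brackets: list[np.ndarray]) -> np.ndarray:
    """Blend a bracket of frames, weighting each pixel by how close it
    sits to mid-gray, a stand-in for "well exposed".

    brackets: same-shape grayscale frames as floats in [0, 1].
    """
    stack = np.stack(brackets)                       # (n, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)   # favor midtones
    weights /= weights.sum(axis=0, keepdims=True)    # normalize per pixel
    return (weights * stack).sum(axis=0)             # weighted blend
```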