
Reddit user ibreakphotos discovers that Samsung's 'Space Zoom' simply replaces users' moon photos with higher-res images of the moon, demonstrated through a clever testing process.

reddit.com/r/Android/comments/

This isn't computational photography — it's inserting imagery that simply isn't there.

Tobias Due Munk

@halide it _is_ there though. The phone just isn’t able to capture it.
Why should my photos that include the moon not look like what I was seeing when I took the picture?

@tobiasdm @halide because this isn't adding the detail you saw, it's adding manufactured, pre-stored data that doesn't match reality. It just infills a texture. Do you want your camera to do this for, say, your kids' eyes too?

@sdw @halide if it knew what my kids' eyes looked like, probably. I'm just taking a picture of the moon though, so it doesn't have the same sentimental value to me as my kids' eyes. I wish the camera sensor could capture what I saw better, but this seems like a decent workaround for now.
I can't agree with the this-is-fake response since all photos are fake compared to what _I_ saw anyway. We need brain sensors for that 😊
I just want a picture of how I saw something. And the moon had texture.

@sdw @halide and isn’t Halide’s 3x zoom on non-Pro phones technically using pre-stored data?

@sdw @halide hehe certainly an unpopular opinion judging by all the comments 😛
But can we get closer to a technical explanation of the different types of enhanced photography we're seeing these days? Computational photography, super resolution, machine learning. I'm not sure I see a clear red line between what is OK and what is not.

I do see a conservative "real photography" group doing gatekeeping as technology and culture move forward, with their definition of the red line changing over time.

@tobiasdm @sdw @halide
I think what Samsung does is much closer to image generation than photography. It is quite similar to something like Dall-E (but smaller in scale due to the narrower use case), with the difference that instead of writing a prompt like "Photo of the moon with a crater on the right and top right and some dark spots close to the center" you give it a low-resolution photo containing the same information and the NN produces a picture.

@tobiasdm @sdw @halide imagine if instead of a camera, this were a telescope, and you were an astronomer. Photography is a form of knowledge discovery. It shouldn’t extrapolate.

@tobiasdm @sdw @halide It is fake. It 'deceives' you by suggesting that your phone captured this detail and it 'replicates' the detail from other photos. That is what fake is. You might be ok with it in this context, but you might not in others.

@tobiasdm @sdw @halide I find this highly deceptive. It gives a false idea of what technology can do, and at the same time it sets a dangerous precedent of personal computers replacing reality with simulation with zero transparency.
I wouldn't wanna put this sort of a "Santa Claus" inside the tech my children get familiar with. I would want my children to grow up with a grasp on what's going on.

@sdw @tobiasdm @halide It's also necessarily going to be badly mismatched with the surrounding photo, at least in some cases. Clouds, color from atmospheric effects, focus, etc.

@sdw @tobiasdm @halide It's important that we continually draw attention to this kind of garbage faking "photographs", because these "photographs" will end up getting entered into evidence in trials and sending someone to jail based on something some AI bro's creation pulled out of its ass. Maybe it won't be the moon that does that, but the sooner the public starts to understand the scope of the fakery, the better.

@sdw is it though?

This looks like aggressive image sharpening on the blurred image and whatever noise the camera chip introduces.

To test for pre-stored detail you'd have to use an image of a cratered white object that isn't the moon and see if a moon is being put on top of it.

@betalars @sdw The picture on the left is a 170x170 pixel photo of the moon that was blurred, then blown up and shown on a computer screen, which was then photographed with the phone – resulting in the right picture. There is no mathematically possible way to get the detail on the right from just the picture on the left.
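The irreversibility chrisk describes can be sketched in a few lines: a heavy blur is a low-pass filter, so two images with different fine detail collapse to nearly the same blurred result, and nothing in the blurred pixels alone tells you which original produced them. The following toy example (hypothetical pixel values, a naive box blur, no real camera pipeline) illustrates the point:

```python
# Toy illustration (hypothetical values): a blur destroys fine detail,
# so two different source patterns become nearly identical, and the
# original detail cannot be recovered from the blurred values alone.

def box_blur(img, k=3):
    """Naive k x k box blur over a 2D list of grayscale pixel values."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

# Two different 4x4 "surfaces": a checkerboard of fine detail vs. a
# flat gray patch. Both have the same average brightness (50).
detail = [[100, 0, 100, 0],
          [0, 100, 0, 100],
          [100, 0, 100, 0],
          [0, 100, 0, 100]]
flat = [[50] * 4 for _ in range(4)]

blurred_detail = box_blur(detail)
blurred_flat = box_blur(flat)

# Interior pixels of both blurred images land close to 50: the
# checkerboard's structure is gone, and no sharpening step can
# mathematically reconstruct it from the blurred values.
```

Any "detail" a phone then shows after photographing such a blur has to come from somewhere other than the captured light, whether that is a stored texture, a trained model's prior, or both.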

@betalars @sdw I recommend reading the full post. They do numerous tests, including clipping the whites of the test image, and it always results in the same detailed photo of the moon.

@chrisk okay the last edit had me convinced, I didn't read as much.

@sdw @tobiasdm @halide

Imagine a massive meteor strike on the moon that adds a new deep crater down there in the south where it's mostly flat.

Everyone's trying to take photos of the new crater, but the phones are removing it and adding in the pre-loaded moon texture instead.

@tobiasdm @halide Why stop there, then? Get rid of the camera entirely. Ask DALL-E to render everything you want images of. Much cheaper.

@tobiasdm @halide No, the picture on the left is of a computer monitor in a dark room showing a full-screen image that was digitally blurred. That is, this is not a photo of the moon, it is a photo of a badly blurred image of the moon. The detail on the right is fabricated and does not match what you would've been seeing in person.

@tobiasdm @halide But in this case you weren't seeing the moon, you were looking at a low-res blurry picture on a computer screen.

@tobiasdm If you want to look at a photograph someone else has taken, why bother using a camera in the first place? Just go and download one. There will be far better ones than whatever you would take with a phone camera.


@tobiasdm it literally is not there in this example. And they're lying about what the "camera" is doing.

@tobiasdm @halide You don't value authenticity, but can't you see why many other people do?