Snapseed for iOS updates with new HDR Scape filter, pre-iOS 7 users beware

Part of today's Google+ event announcements centered on Snapseed, and the new features coming to the Google-acquired photo editing software. One much-touted new feature was the HDR Scape filter, and the iOS application has just been updated to include it. But if you're not on iOS 7, you should tread carefully.

The iPhone camera has been able to take a form of HDR image for some time now, but the demonstration at the Google event showed some pretty dramatic effects applied to photos. True HDR requires multiple images taken at different exposures; Google is instead trying to replicate the effect artificially from a single shot.

A word of warning goes out to anyone still using iOS 5 or iOS 6, though. There is a bug that causes problems with sharing and saving images, and Google says that an update will be released for this in due course. For everyone else, grab the latest version from the App Store at the link below. Where does Snapseed stand in your mobile photo editing arsenal?


Richard Devine

Senior Editor at iMore, part-time racing driver, full-time British guy


Reader comments

(serious questions; I'm green on HDR specifics)

Isn't taking multiple pictures just one way of achieving what the algorithm is doing? Wasn't using multiple images the way you "had" to do it because the software was incapable before?

In order to achieve true HDR, you still need multiple images. Camera sensors are nowhere near as sensitive as the human eye. If you expose a photo so the sky is properly exposed, the ground will be underexposed (too dark). Likewise, if you expose for the ground, the sky will be overexposed (too bright). HDR takes several photos at different exposures, then merges the properly exposed parts of each into one photo so the entire scene is properly exposed.
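For the curious, here's a minimal sketch of that merging step in Python, using OpenCV's Mertens exposure fusion. The three bracketed filenames are placeholders, and this is just one way to do the merge, not necessarily what any particular camera app does:

    import cv2

    # Three shots of the same scene bracketed at roughly -2 / 0 / +2 EV
    # (placeholder filenames; any bracketed set of the same scene will do).
    exposures = [cv2.imread(p) for p in ("under.jpg", "normal.jpg", "over.jpg")]

    # Mertens fusion weights each pixel by how well exposed it is in each
    # frame and blends them, keeping detail in the sky and the ground alike.
    fused = cv2.createMergeMertens().process(exposures)  # float32, values in [0, 1]

    cv2.imwrite("hdr_fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))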

So, to reiterate: if you take an image, apply -2/+2 exposure adjustments, and then "merge" the results, how is that artificial?

I can see how the software may be less accurate this way, but calling it artificial, as the article does, seems a stretch.

I guess I'm a little confused as to your question. Taking multiple pictures at various exposures and merging them is not artificial. That's what HDR is.

What Snapseed is doing is taking one photo and creating a fake HDR image out of that one photo by (I assume) boosting the shadows and decreasing the highlights for you. I'm curious as to how this will work with real-world photos, since JPEGs are already compressed and data has been lost.

Since you are only using a single image and adjusting the exposure on various parts of the photo, that is artificial. No algorithm in the world can bring back detail that isn't there in the over- or underexposed parts of the photo.
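To make the difference concrete, here's a rough single-image sketch of the kind of shadow boost and highlight cut described above. The thresholds and weights are invented for illustration; this is not Snapseed's actual algorithm:

    import cv2
    import numpy as np

    # One ordinary JPEG (placeholder filename), scaled to floats in [0, 1].
    img = cv2.imread("photo.jpg").astype(np.float32) / 255.0

    # Per-pixel luminance decides where to lift shadows and cut highlights.
    lum = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)[..., np.newaxis]
    shadow_lift = np.clip(0.35 - lum, 0.0, None)    # strongest in dark areas
    highlight_cut = np.clip(lum - 0.75, 0.0, None)  # strongest in bright areas

    # Boost shadows and compress highlights. Note that regions clipped to
    # pure black or white carry no detail, so nothing real is "recovered".
    result = np.clip(img * (1.0 + 1.5 * shadow_lift) - 0.5 * highlight_cut, 0.0, 1.0)

    cv2.imwrite("fake_hdr.jpg", (result * 255).astype(np.uint8))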

Snapseed #1 and PicsArt #2 are the first apps I download to any of my iOS devices, or even my GS4 (please Apple, give me a 4.8-5" iPhone so I can ditch this toy!). Then comes the rest of the arsenal!