High dynamic range (HDR) mode is becoming a standard feature in newer digital cameras and smartphones. By snapping multiple photographs at different exposure levels, the camera can automatically generate an image that captures a greater range of light and dark areas than a standard photograph. However, the technique does have its weaknesses. Artifacts appear if any changes occur in the scene between the different shots, which limits the scenarios in which the technique can be used.
Apple wants to overcome this issue by implementing an HDR mode that only requires a single exposure. A recently published patent shows that Apple is well on its way to doing so. Read more…
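Conceptually, the multi-exposure merge that these HDR modes automate can be sketched in a few lines. Here's a toy exposure-fusion pass in Python; the mid-tone weighting scheme is an illustrative assumption on our part, not Apple's patented method:

```python
import numpy as np

def merge_exposures(images):
    """Merge bracketed exposures (pixel values in [0, 1]) into one image
    by weighting each pixel toward mid-tones, so clipped shadows and
    blown highlights contribute less to the result."""
    stack = np.stack([img.astype(np.float64) for img in images])
    # Hat-shaped weight: highest at 0.5 (well exposed), near zero at 0.0 and 1.0.
    weights = np.clip(1.0 - np.abs(stack - 0.5) * 2.0, 1e-6, None)
    return (stack * weights).sum(axis=0) / weights.sum(axis=0)

# The dark frame preserves highlight detail, the bright frame shadow detail.
under = np.array([[0.05, 0.45]])  # underexposed frame
over  = np.array([[0.55, 0.95]])  # overexposed frame
fused = merge_exposures([under, over])
```

As the Post's first paragraph notes, this only works cleanly if nothing moves between frames, which is exactly why a single-exposure approach would be a big deal.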
When doing certain types of welding, special helmets with dark lens shades should be used to protect the eyes from the extremely bright welding arc and sparks. The masks help filter out light, protecting your eyes, but at the same time make it hard to see the details in what you’re doing. In other words, the dynamic range is too high, and wearers are unable to see both the arc and the objects they’re welding.
A group of researchers in the EyeTap Personal Imaging Lab at the University of Toronto have a solution, and it involves cameras. They’ve created a “quantigraphic camera” that can give people enhanced vision. Instead of being tuned to one particular brightness, it attempts to make everything in front of the wearer visible by using ultra high dynamic range imaging. Read more…
Trey Ratcliff is the well-known and well-loved HDR photographer behind the travel photography blog Stuck in Customs, and in this behind-the-scenes video he talks you through his gear and how he sets up a few shots of this rocky beach in the Virgin Islands. The video offers some great insight into Trey’s thought process as he composes the resulting HDR images, one of which you can see in higher resolution (including some 100% crops) here.
Washington Times video producer Drew Geraci created this amazing time-lapse video of an asylum that was abandoned decades ago.
Opened in the early 1920s, the Asylum closed down and was abandoned decades ago. Rooms remain untouched – left as they were when the last of the employees departed. These buildings stand as a testament to the horrors and mistreatment that patients had to endure during the time of its operation.
Our 7 month journey into the Asylum led us on many adventures; from dodging security vehicles, ghostly figures and even a meth head. This is no place for the faint of heart. Asbestos blanketed every room we entered like new winter snow, so shooting was sometimes difficult.
They used a combination of traditional HDR, tone-mapping, and time-lapse techniques. Every single frame was a still photo, and they snapped 35,000 of them with two Canon 5D Mark II DSLRs over 7 months to complete the project.
Here’s a tutorial on how to do non-automated HDR for real estate photography using Photoshop CS5. The first thing you’ll need is a sturdy tripod with a level. The closer you are to a leveled image, the less correction you’ll have to do later. Read more…
The Washington Times raised some eyebrows last Friday after running an uber-saturated front page photo with the caption stating that it was “a composite created by taking several photos and combining them with computer software to transcend the visual limitations of standard photography.” After emailing the photo editor, Poynter learned that the image was simply an HDR photograph. While it’s a pretty common technique these days, some believe that it has no place in photojournalism.
Sean Elliot, president of the National Press Photographers Association, said, “HDR is not appropriate for documentary photojournalism.” The organization’s code of ethics says photographers should respect the integrity of the digital moment, “and in that light an HDR photo is no different from any other digital manipulation.”
“By using HDR,” he told me by email, “The Washington Times has combined different moments, and thereby created an image that does not exist. The aircraft visible in the final product was not there for all the other moments combined into the final, and that alone simply raises too many questions about the factual validity of the actual published image.” [#]
What complicates matters is that many new cameras (e.g. Nikon D4, Apple iPhone 4S) offer HDR features that create single images from multiple exposures in the camera. The Washington Times published a response to the controversy yesterday. Do you think HDR is an appropriate technique for photojournalists to use?
Here’s an educational time-lapse tutorial by Los Angeles-based architectural photographer Mike Kelley in which he walks through how he goes about photographing buildings. His technique might be described as “manual HDR” — after shooting the building over a longish period of time to capture different lighting conditions, he then enters the scene and lights different areas of the building using two Canon 430EX Speedlites. Afterward, he loads the stills into Photoshop and selects different portions of the scene from different photos depending on the lighting he wants. The finished composite photo ends up looking as if it were lit by a large number of Speedlites.
Late last year we showed you an interesting demonstration of HDR video filmed using two Canon 5D Mark IIs. The cameras captured the exact same scene at different exposure values using a beam-splitter. Now, a new camera called AMP has been developed that captures real-time HDR video using a single lens. The trick is that there are two beam-splitters in the camera that take the light and direct it onto three different sensors, giving the system a dynamic range of 17 stops. Check out some sample clips in the video above — the footage might be pretty rough, but the technology here is quite interesting. Read more…
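To put that 17-stop figure in perspective: each stop doubles the range of light a system can capture, so the math works out to a contrast ratio of 2^17, or about 131,072:1. A quick Python check (the decibel conversion uses the standard 20·log₁₀ convention for sensor dynamic range):

```python
import math

def stops_to_ratio(stops):
    """Each stop doubles the captured light range."""
    return 2 ** stops

def stops_to_db(stops):
    """Sensor dynamic range in decibels: 20 * log10(contrast ratio)."""
    return 20 * math.log10(stops_to_ratio(stops))

ratio = stops_to_ratio(17)  # 131072:1 contrast ratio
db = stops_to_db(17)        # roughly 102 dB
```

For comparison, typical DSLR sensors of the era managed somewhere in the neighborhood of 11–14 stops in a single exposure.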
Here’s a good example of when HDR photography is useful: NASA created this image of the Space Shuttle Endeavour lifting off for the final time by combining six separate photographs.
Each image was taken at a different exposure setting, then composited to balance the brightness of the rocket engine output with the regular daylight levels at which the orbiter can be seen. The processing software digitally removes pure black or pure white pixels from one image and replaces them with the most detailed pixel option from the five other images. This technique can help visualize debris falling during a launch or support research involving intense light sources like rocket engines, plasma experiments and hypersonic vehicle engines. [#]
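The pixel-replacement step NASA describes can be sketched quite simply: keep a base exposure wherever it's usable, and wherever a pixel is clipped to pure black or pure white, pull in the value from whichever other exposure has the most usable tone there. Here's a toy version in Python — using "closest to mid-gray" as a stand-in for "most detailed pixel," which is our simplifying assumption, not NASA's actual criterion:

```python
import numpy as np

def replace_clipped(images, low=0.0, high=1.0):
    """Keep the base (first) image's pixels unless they are clipped to
    pure black or pure white; substitute the clipped pixels with the
    candidate value from the other exposures closest to mid-gray."""
    stack = np.stack([img.astype(np.float64) for img in images])
    # Distance from mid-gray: smaller means more tonal detail available.
    detail = np.abs(stack - 0.5)
    best = np.take_along_axis(stack, detail.argmin(axis=0)[None], axis=0)[0]
    base = stack[0]
    clipped = (base <= low) | (base >= high)
    return np.where(clipped, best, base)

# Pixel 0 is crushed to black in the base frame; pixel 1 is fine as-is.
base = np.array([[0.0, 0.6]])
alt  = np.array([[0.4, 1.0]])
fixed = replace_clipped([base, alt])
```

With six bracketed frames, as in the Endeavour composite, the same logic simply has five alternate exposures to draw from at each clipped pixel.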
In mid-2010, Time Magazine showed off a demonstration of a slick tablet app they were making in collaboration with The Wonderfactory. As it became widely shared across the web, HDR photographer Trey Ratcliff of Stuck in Customs started receiving messages from fans who spotted his work in the video demo. Problem was, he had never given the magazine or the agency permission to use his work. Read more…