Photographs Captured Over Years with an Open Camera Shutter

German photographer Michael Wesely has spent decades working on techniques for extremely long camera exposures, typically lasting two to three years. In the mid-1990s, he began using the technique to document urban development over time, capturing years of building projects in single frames. In 1997, he focused his cameras on the rebuilding of Potsdamer Platz in Berlin, and in 2001 he began photographing the Museum of Modern Art’s ambitious renovation project. He uses filters and extremely small apertures to reduce the amount of light striking the film, creating unique images that capture both space and time.

Wesely says that his technique could be used for even longer exposures of ten, twenty, or even forty years. You can learn more about Wesely and view his list of exhibitions over on his website, or purchase the book he created from this Open Shutter project.

Image credits: Photographs by Michael Wesely

  • Fred

    Wow, love that.

  • freeboprich

    Spectacular, I can’t stop looking at them!

  • snem

    A quake will kill your years-long project…

  • Spider-Man

    This guy wins the game of photography!

  • Jeremy McMahan

    I don’t know if it’d kill it… A few stray photons for 5
    minutes wouldn’t ruin a 2-year exposure—as long as the camera came back to
    rest in the same exact spot. Interesting thought exercise to consider the
    ramifications though.

  • MikeAlgar42

    I want to see this in EXTREME LARGE! They are so incredibly detailed and such mind fuckery. It is a really interesting experiment, and it documents urban sprawl well, especially the idea of reaching towards the sky in big cities and industrialisation. Imagine doing this from this point now to 30 years down the line on a greenbelt somewhere just on the outskirts of a city.

  • Mantis

    Agreed.  I’d love to see large prints of these…

    …hanging inside my own house.

  • Philip Han

    I was contemplating beginning a project based on Hiroshi Sugimoto’s long exposures, as I do astrophotography and time-lapse photography. I also love my Holga 120 Wide Pinhole Camera! So I decided to do extreme long exposures, and of course I go to PetaPixel and the first article has to be the very same idea I had. Haha!

  • Kesha Webster9

    those are some cool freaking pics

  • stanimir stoyanov

    Does anybody know if the original photographer is selling prints?

  • Dale Bender

    Ghostly-looking!! Try this with a digital camera you new tech droids!!!

  • Freddy K

    It could be done with digital too – you just take one image once in a while, then align them using a tool like Hugin and average them all.

    In any case, the main ingredient is patience; the technology doesn’t matter so much. With digital, you could also watch the progress as you go, which makes it a bit easier.
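Freddy K's align-and-average idea can be sketched in a few lines. This is a toy version: it assumes the frames have already been aligned (e.g. with a tool like Hugin), and the arrays below are synthetic stand-ins for loaded photographs.

```python
# Minimal sketch: average a stack of pre-aligned frames to simulate
# one long exposure. Alignment is assumed to have been done already;
# the frames here are synthetic arrays standing in for real photos.
import numpy as np

def average_frames(frames):
    """Average a list of aligned frames into one float image."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for f in frames:
        acc += f
    return acc / len(frames)

# Synthetic example: three 4x4 grayscale "frames"
frames = [np.full((4, 4), v, dtype=np.uint8) for v in (10, 20, 30)]
avg = average_frames(frames)
print(avg[0, 0])  # 20.0
```

Accumulating in float64 before dividing avoids the clipping and rounding you would get by averaging pairwise in 8-bit.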

  • Daniel Esquibel

    It’s called solargraphy & I’ve been a solargrapher for a year & a half now. However, Michael Wesely is the king of solargraphy & his method is unknown to almost every solargrapher!

  • Andrew Bowness

    Been a solargrapher for a year and a half? So you’ve taken one or two pics then? Kidding!

    Is it something you build up to (day-long exposures, then a week, then a month…)? I can’t imagine being brave enough to start out with a year-long exposure in case I did something wrong.

  • Al Bartkus

    And 1 minor earthquake or tremor could mess up a year-long exposure lol.

  • Hchord

    Perhaps not. Considering the size of the aperture, a few hours of jostling would have only a minor effect on the film, since the total exposure needed to capture the image is orders of magnitude longer by comparison.

  • Jessie Kaufman

    I’m confused. How did they change the battery?

  • Tyler James Branston

    No battery. Shot on film. You manually open and close the shutter.

  • Jes

    This kind of anti-technology attitude was prevalent when photography was beginning, too; people then believed nothing could replace painting… BTW, film or a digital sensor would make very little difference (apart from needing a reliable power source to keep the sensor active), as the technique relies on filters and tiny apertures.

  • hostile_17

    But it’s just not the same without some sepia filter added by an app or something…

  • quickpick

    Must have been great fun to invent and execute such an idea, but there’s not much here that multiple exposures and photo editing with layers couldn’t do… Kudos anyhow for the effort; they are very good-looking pictures no matter what!

  • Oj0

    Correct me if I’m wrong, but f/32 and three ten-stop ND filters should allow you to get this? A ten-stop filter lets through 1/1024 the light, so by the second one you’re looking at 1/1,048,576 the light and by the third one it’s 1/1,073,741,824 of the light. Using Sunny 16 and an aperture of f/32 you’re looking at an exposure time 4,294,967,296 times longer than 1/100, or 42,949,672.96 seconds. That’s 497.1 days without accounting for the darkness of night. Unless I’m off, I’m surprised. Before I put any thought into it, I was imagining something more like f/500 with a dozen filters.
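The stop arithmetic above checks out and can be verified in a few lines. This is just a sketch of the comment's own figures: a Sunny 16 baseline of 1/100 s (ISO 100), three 10-stop ND filters (30 stops), and f/32, which is two stops darker than f/16.

```python
# Sketch of the stop arithmetic above: each stop halves the light,
# so the required shutter time doubles per stop of light reduction.

def exposure_seconds(base_shutter, nd_stops, aperture_stops):
    """Shutter time after reducing light by the given number of stops."""
    total_stops = nd_stops + aperture_stops
    return base_shutter * (2 ** total_stops)

# Three 10-stop ND filters, plus f/32 (two stops darker than Sunny 16's f/16)
t = exposure_seconds(1 / 100, nd_stops=30, aperture_stops=2)
print(t)          # ~42,949,672.96 seconds
print(t / 86400)  # ~497.1 days
```

As the comment notes, that 497-day figure ignores nights and bad weather, so the real-world exposure stretches out considerably further.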

  • Mark Penrice

    Pity he started it in the late 90s … could be so much easier to do it now using a regular digital camera. Doesn’t even need to be a good one – if it’s taking one maximum-exposure shot each day, then after 10 years each one will only be contributing the equivalent of a tiny fraction of one bit to the overall total. Or it could take several each day, say one every quarter hour (or more often, if that’s likely to have the sun appearing as overlapping dots rather than a streak) and not even come close to filling a 64GB memory card – let alone an attached HDD with a wireless connection – for the ultimate in HDR…

    The final compositing may take a couple of hours to process, but likely no more than that.

  • Mark Penrice

    Pretty easy really. You could leave the thing to run in continual-shoot mode and just stream the data as fast as possible to an attached laptop … which would add the received pixel values into a master matrix, which could later be divided up to provide the final image. At one per second, and 8 bits per subpixel, you could probably manage 40 years easy with a 40-bit integer representation of each subpixel’s total data. 80 years at 30fps and 10-bit colour depth with 48-bit master pixels (and basically unlimited capabilities with 64-bit), so long as your system could handle capturing and transferring at the required data rate… whilst continual backups were being taken in case of equipment failure.

    (Let’s see… a 12-megapixel image, i.e. about 4096 x 3072… with 8-bit subpixels… that’s 36 MB of raw data (ignoring the whole Bayer issue, which might mean it’s actually 12MB anyway)… once per second is 36MB which is a bit too much for USB2, but should be fine with USB3, and that’s being built into smartphones now. Obviously the internals need to be OK with recording at that speed, but the attached computer will have little difficulty with updating the master record in realtime; with 40 bit subpixels it’s 180MB/sec raw rate, or 216MB read (old image + new data), 36 million high-precision integer addition operations, and 180MB/s write. Or if we’re using 64-bit words in memory for convenience, 288MB. A computer of 10 years ago could have handled that. And really, you can separate it out into 256 slides of 32-bit data, covering about two months of imagery each, which reduces the data rate still further. After all that, you take your 9GB of raw data, sum it all together, and divide by whatever factor is necessary to produce either a 36-bit high-colour-depth integer image or, even better, a floating point true-HDR one…

    It’s the high fps that would be more of a problem – if you’re shoving that amount of information around at 30fps, then instead of trying to keep up with approx 1GB/s of raw data and pulling a constant gigaflop of extra-long-int additions, you’d probably want to try capturing individual photon events as they hit the CCD, through a dark NDF, rather than doing a full-frame capture, and build up an image with a somewhat more modest final bit depth… total photons per cell would still likely be in the millions, if not billions, but 32 bits per subchannel might well be ample. Plus if it overflows at any point you just briefly suspend the operation for a second or two to divide the existing data by 2, and from then on only count every second photon, sort of like how a traditional data logger works when it runs out of memory space.)

    And it would actually BE less failure prone, too; instead of coming back after 5 years into a 10 year thing to find a bird had pecked your camera open, or an overzealous cleaner had knocked it over and put it back in the wrong place, etc, ruining the whole mission, you could be alerted straight away if something was wrong with the capture system, and be able to set up and re-align any replacement just-so by using a live image overlay onto the existing record…
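The accumulator scheme described in these comments can be sketched in miniature. This is a toy version under stated assumptions: the camera read-out is faked with random data, the frame is shrunk to 64×64 for illustration, and a 64-bit unsigned master matrix stands in for the wide-integer "master record" (at 8 bits per subpixel and one frame per second, a 64-bit sum could not overflow within any human lifetime).

```python
# Hedged sketch of the running-sum idea: add each incoming 8-bit frame
# into a wide-integer master matrix, then scale down at the end.
import numpy as np

rng = np.random.default_rng(0)

def capture_frame(shape=(64, 64, 3)):
    # Stand-in for a real camera read-out (random pixel values)
    return rng.integers(0, 256, size=shape, dtype=np.uint8)

# 64-bit accumulator: effectively immune to overflow at these rates
master = np.zeros((64, 64, 3), dtype=np.uint64)
n_frames = 1000
for _ in range(n_frames):
    master += capture_frame()

# Normalise the accumulated totals back down to an 8-bit image
final = (master / n_frames).astype(np.uint8)
print(final.shape)  # (64, 64, 3)
```

Keeping the division to a single final step is what preserves the extra dynamic range the comment is after; the running totals could equally be dumped as a floating-point HDR image instead of being quantised back to 8 bits.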

  • Mark Penrice

    Don’t forget also that if you’re calculating for daylight film on a sunny summer’s day, there will be cloudy / rainy days and also the shorter, darker ones of winter to deal with. Add it all together, along with a possibly even-smaller pinhole, and a 5- or even 10-year exposure suddenly doesn’t seem too over-the-top.