What Airplanes Look Like to Google Map Satellite Cameras

Apparently airplanes travel a little too fast for the satellites that provide photos for Google Maps. One of those satellites happened to capture this plane shooting across the sky over Hyde Park in Chicago, but separated it into a phantom plane and three RGB shadows. Anyone have an explanation for what caused this phenomenon?

map: Hyde Park, Chicago, IL (via Photoxels)

  • Konstantin

    Looks like a multi-pass RGB camera, possibly a B&W sensor with colored filters. The leading image looks like a digital High Pass filter in Overlay mode (Photoshop), something like the Clarity slider in Lightroom.

  • Anonymous

    They probably capture each color channel separately and then combine them. Someone could probably calculate the amount of time between each frame capture from the plane’s “RGB shadows”!

  • Constantine Thomas

    It’s most likely because the colour image is made up of greyscale images taken through three different (red, green, and blue) filters. Since the plane’s moving as the pictures are taken, when they are combined to make a colour image, you’ll see the component RGB images.

    Though I’m not sure why there’s a combined image at the front of the sequence here. Maybe there’s another filter involved in this case too (possibly a “Clear” filter?).

  • Anon

    The satellite captures each channel separately.

    Assuming 200 km/h (the plane looks low and close to landing, but I could be wrong), each channel looks about 10 m apart. That means the camera takes each channel about 1/5 s apart, which does not seem unreasonable.
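    The arithmetic behind that estimate can be sketched quickly (both numbers are the commenter’s guesses, not measurements):

```python
# Time between channel captures from an assumed plane speed and an
# assumed spacing between the RGB "shadows".
speed_kmh = 200.0                   # assumed ground speed
speed_ms = speed_kmh * 1000 / 3600  # ~55.6 m/s
offset_m = 10.0                     # assumed per-channel offset

dt = offset_m / speed_ms            # seconds between channel exposures
print(f"~{dt:.2f} s between channels")  # ~0.18 s, roughly 1/5 s
```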

  • oi99uk

    I think satellite imaging systems use precisely calibrated optical filters to create full colour images, so the actual full colour photo is probably made up of three colour channels and one luminosity channel that are combined. It takes fractions of a second to acquire each separate channel, which causes no perceptible blur for static objects (the ground), but is obvious for an aircraft moving at several hundred miles per hour, since each filter freezes a slightly different moment.

  • -b-

    It’s not unusual for satellites to use a single row of pixels per color and build up an image by reading those pixels repeatedly as the satellite moves over the landscape. This is often called “pushbroom imaging”. See, for example, descriptions of how the HiRISE camera works at Mars.

  • RichD

    The first explanation that comes to mind is that the on-board camera may be a “scanning” type, which uses a monochrome sensor shooting through filters to capture the red, green, and blue channels, and possibly an IR channel as well, and then composites them together. There would be a time lag between each phase of the exposure while the filter is changed and the data from each pass is streamed off the sensor.

    That would explain why a relatively fast-moving object like an airplane would ghost like this – its position has changed between each phase of the exposure, so when they’re composited, each of the sub-frames shows it in a slightly different position.

    Given the same exposure time with a full-color CCD, that movement would simply show up as a blurred object.

    Given the dollars-per-pound limitations of putting things in orbit, it may be that the circuitry was simpler and/or lighter with a monochrome sensor and filters rather than a full-color sensor. Or perhaps the satellite was intended for multiple uses – infra-red and ultra-violet imaging in addition to “full color” photos – in which case a monochrome sensor with filters is definitely lighter and less expensive to produce and put in orbit than one with half a dozen or more separate sensors.

  • IanWorthington

    Might one of those channels be IR?

  • Chris Penner

    I think the idea is right, even if your estimates are slightly off. That airplane’s wing is not going to be 10 m wide at the tip. Probably closer to 2 m, if even that.

  • Anon

    I’m surprised that nobody blamed the ‘terrorists’…yet

  • Adam

    Check out this site for more images with RGB “shadows”:—Civilian-(in-flight).htm

  • Michael WM

    It’s called pansharpening, and it’s a very standard technique for color cameras in orbit.

  • pete

    I am not even sure this is not fake… The plane seems to be heading in to land on runway 22L, but it also seems that runway 22R is the one in use, and the shadows at the airport are the same as in the park.
    This may all be real, but I am not sure. The wings don’t really show whether the air brakes are on.

  • Ross

    I reckon this is the closest to the answer.

    Monochrome sensor, switchable RGB filters over the whole sensor (rather than over alternate pixels, as on most digicams’ Bayer filter sensors), then a final filterless frame for pure luminosity data.

  • Constantine Thomas

    It’s the norm for spacecraft – none of the interplanetary probes (Voyager, Viking, Cassini, Galileo etc) have colour cameras – they all send back greyscale images through filters of various wavelengths.

  • bluedeviloptics

    They probably captured the imagery with line sensors (to get very high-resolution imagery you can use very high pixel-count linear arrays, >16,000 pixels). To get color they have multiple line sensors, each with a different color filter. Since the sensors are separated in space, the color channels see different regions of the ground at any instant. This is compensated for, but only for ground objects that aren’t moving. When things are moving fast, you get artifacts like these. Prism-based systems don’t have this issue, since they split the same line of light.
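    That misregistration can be illustrated with a toy model (all numbers below are assumptions for illustration, not properties of the actual satellite):

```python
# Each color channel samples the scene a moment apart, so a moving
# object lands at a different column per channel while static ground
# stays registered.
width = 40
dt = 0.2      # assumed seconds between channel captures
v_px = 10     # assumed object speed, in pixels per second

channels = []
for c in range(3):                         # capture order: R, G, B
    line = [0.0] * width
    line[5] = 1.0                          # static ground feature: same column in every channel
    line[20 + round(v_px * dt * c)] = 1.0  # moving object drifts channel to channel
    channels.append(line)

# The static feature is lit in all three channels (appears white);
# the object shows as three single-channel ghosts, two pixels apart.
print([line.index(1.0, 10) for line in channels])  # → [20, 22, 24]
```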

  • Brian Turner

    Like Konstantin I believe it is a B&W camera with colored filters. It takes four images in quick succession capturing blue, followed by green, then red, and finally no filter. You can easily reproduce this process by following the steps in the linked article.

  • Ultradiv

    You’re all wrong! It’s not an airplane but an aeroplane.

    Waiter waiter! There’s a fly in my soup!
    It’s not soup sir, it’s broth!

  • Josh Rovero

    IMHO, this is an aerial photograph shot from an aircraft, not a satellite. The resolution looks much better than 1 meter. Not all Google imagery is from satellites…

  • Photosophy

    Ta da!

    Oh wait…that’s the Google car not satellite.

  • Samat Jain

    A note: the higher-resolution Google “satellite” imagery isn’t done by satellite — it’s aerial photography, done high enough to minimize perspective distortion.

  • dave

    that has to be a rainbow unicorn. flying one.

  • Guest

    Doppler effect

  • Guest.

    I doubt that’s possible without knowing the plane’s velocity.

  • Anonymous

    Planes have standard speeds

  • Cherrykot

    I found two while doing some research. The first one I saw in L.A. I thought it was a plane on the freeway ramp. I didn’t know what to think really.

    images on Google Earth, Nov 14, 2009
    34.02821 -118.21108, Los Angeles, CA, Pomona Fwy, I-10, I-5

    images on Google Earth, August 29, 2010
    39.39704 -76.43264, Perry Hall, MD, I-95

  • Guest

    Yes, they have standard speeds, but if the plane was taking off or landing, then it would be very hard to guess the speed.

  • Jim

    It’s a cloaking device for planes, but Google sees everything.

  • Bryn

    Planes in Class B airspace and under 10,000 ft are limited to 250 knots (~290 mph), and within 4 nautical miles of the airport, 200 knots. On final approach many airliners are still doing 150–180 knots. You could calculate the altitude from the length or wingspan of the plane relative to the length of a tennis court.
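    The suggested scale trick can be sketched with similar triangles. Apart from the court length (a tennis court is 23.77 m long), every number below is an assumption for illustration, not a measurement from the actual image:

```python
# The tennis court calibrates pixels-to-meters at ground level. The
# plane sits above the ground, so it images larger than its true scale;
# with an assumed camera altitude, similar triangles give its height:
#   apparent / true = camera_alt / (camera_alt - plane_alt)
court_len_m = 23.77     # regulation tennis court length
court_len_px = 50.0     # assumed measured court length in the image
m_per_px = court_len_m / court_len_px

wingspan_px = 80.0      # assumed measured wingspan in the image
apparent_span_m = wingspan_px * m_per_px
true_span_m = 34.0      # assumed true wingspan of the aircraft

camera_alt_m = 3000.0   # assumed survey-camera altitude
plane_alt_m = camera_alt_m * (1 - true_span_m / apparent_span_m)
print(f"plane altitude ~{plane_alt_m:.0f} m")
```

    As Bernie notes below the thread, the weak link is knowing the camera’s altitude.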

  • Bernie

    Sweet analysis. The only thing you still need to know is the standard height of a Google camera. 

  • Shiraz11

    Check out the plane here – London postcode WC1B 5BE

  • Boom

    The plane is obviously at cruising altitude

  • Patrick Staley

    Found the same thing if you look at the corner of Southport and Belden in Chicago

  • ooop

    lool i found them

  • Trevor Hutchins

    The satellite takes four snapshots, one with the Blue sensor, another with the Green sensor, another with the Red sensor, then a final with the alpha/luminosity sensor. It combines them to create one solid, colorized image. This is all done in a split second, but when an object is moving several meters a second, this is how it turns out.
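    That description also suggests how one could crudely “undo” the ghosting (the per-channel offset below is assumed, not measured): shift each later channel back by its capture-order displacement, and the moving object re-aligns, at the cost of mis-registering the ground.

```python
# Toy deghosting: the object appears at columns 10, 12, 14 in the three
# channels; rolling each channel back by its offset re-aligns it.
width = 30
shift = 2   # assumed per-channel displacement of the object, in pixels

channels = [[0.0] * width for _ in range(3)]
for c in range(3):
    channels[c][10 + shift * c] = 1.0   # ghost position per channel

# roll each channel left by its capture-order offset
aligned = [ch[shift * c:] + ch[:shift * c] for c, ch in enumerate(channels)]
print([row.index(1.0) for row in aligned])  # → [10, 10, 10]
```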

  • Ben Keller

    Chromatic distortion near the edge of the lens.

  • jtan163

    Possibly chromatic aberration?

    My understanding is that chromatic aberration is the effect of different wavelengths (colours) of light travelling at slightly different speeds through the lens, so they don’t get focused back to the same point.
    If you get that over a few hundred meters with your camera, imagine what it might be like over, say, 30 km or more (not sure how high a satellite would be above a plane).

    It might be different sensors for each RGB channel, but that would mean the Google cameras weren’t RGB, they’d be RGBY. There’s a yellow ghost too. Oh, and cyan.