But what gear is behind those intriguing images we see so frequently? NASA JPL has put together a short video on the camera equipment on board the Curiosity rover.
Seventeen cameras (the most of any NASA planetary mission) on the rover are responsible for taking both color and black and white images, depending on their role.
At the end of the arm is a camera that takes “high-resolution” color images. Another camera on the mast does geology work. Four cameras on the front and rear of the vehicle (dubbed HazCams) identify potential driving hazards.
There are more, each with a specific function. But why are some cameras capable of shooting color images and others not? According to Justin Maki, who leads the engineering camera team for the rover, there are black and white cameras because “that’s all the rover really needs in order to detect rocks and other obstacles.”
As far as color cameras go, Maki says “scientists use the color information to learn about the soil and the rocks.”
But when NASA says “high resolution,” it’s really anything but (at least by today’s standards). The rover’s cameras top out at roughly 1 to 2 megapixels. It’s 2013 — why on Earth is a super-expensive space project not taking higher resolution images?
Consider that planning for the Curiosity mission began years before launch, and the sensors were locked in back when they were considered cutting-edge. And if you think your ISP’s bandwidth caps are ridiculous, get this:
According to Extreme Tech, daily transmissions are limited to about 32MB! That also explains why the rover doesn’t usually send back video clips — despite the fact that it’s very much capable of doing so.
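To get a feel for how tight that budget is, here’s a quick back-of-the-envelope sketch. The ~32MB/day cap comes from the article; the per-image file sizes are purely illustrative assumptions (compressed 1–2 megapixel frames), not published NASA figures.

```python
# Rough downlink budget sketch for Curiosity's daily transmissions.
# The 32 MB/day cap is from the article; per-image sizes are assumptions.

def images_per_day(cap_mb: float, image_mb: float) -> int:
    """Whole images that fit within the daily downlink budget."""
    return int(cap_mb // image_mb)

# Assume a compressed 1-2 megapixel frame lands somewhere in the 1-4 MB range.
for size_mb in (1.0, 2.0, 4.0):
    print(f"{size_mb:.0f} MB/image -> {images_per_day(32, size_mb)} images/day")
```

Even at the optimistic end, that’s a few dozen frames a day at most — and a single minute of video would blow through several days of budget, which is why still images win out.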