Exposure remains the most critical aspect of photography. Digital sensors are more sensitive to light than film and less forgiving of over- and underexposure. At the same time, digital techniques, both during shooting and in post-production, offer greater opportunities for correcting exposure.

The most important component to consider is the camera sensor. The basic unit of a sensor is the photosite, which collects the light for one pixel and stores it as an electrical charge: a photon excites an electron, and the accumulated charge is read out as a voltage. The spacing of photosites is measured as pixel pitch, the distance from the center of one pixel to the center of its neighbor. A smaller pitch packs in more pixels, so it yields higher resolution. Even the smallest cameras these days have upwards of 10 million pixels, and some high-end DSLRs have as many as 28-32 million. But small photosites generate more noise, so more pixels are not necessarily better, while bigger photosites are. This is a major issue for small cameras.
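The trade-off above can be made concrete with a quick calculation. This is a sketch with illustrative numbers (a full-frame sensor is roughly 36 mm wide; the 6 mm compact-sensor width is an assumed example), showing why the same pixel count means much smaller photosites on a small camera:

```python
# Pixel pitch: the distance from the center of one photosite to the next.

def pixel_pitch_microns(sensor_width_mm, pixels_across):
    """Approximate pixel pitch in microns for a given sensor width."""
    return sensor_width_mm * 1000 / pixels_across

# A 24-megapixel full-frame sensor is about 6000 pixels across:
full_frame = pixel_pitch_microns(36.0, 6000)   # 6.0 microns

# The same 6000 pixels across a small compact sensor (assumed 6 mm wide):
compact = pixel_pitch_microns(6.0, 6000)       # 1.0 micron

# The compact's photosites are six times smaller in each dimension,
# so each one collects far less light and the image shows more noise.
print(full_frame, compact)
```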

Light striking the sensor is stored as an electrical charge in each photosite. Since the charge is analog, it must be converted to digital, so the next step involves an analog-to-digital converter. Because this information is monochrome, a color mosaic filter with a pattern of red, green, and blue is placed in front of the sensor. Pixels fill with charge in proportion to the amount of light they receive, responding to light in a linear way. The resulting values can then be displayed as an image.

Camera manufacturers each have their own method of processing what the sensor sees, all the while trying to produce the most pleasing results. This RAW data is proprietary to each camera company: Canon names its format CR2, for example, while Nikon uses the extension NEF. RAW capture is when no conversion is applied. When the camera is switched from RAW to JPEG, which is a generic format, the images are compressed: information that, by the JPEG algorithm's calculations, is not pertinent to the final outcome of the image is discarded, resulting in a much smaller file size.
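The lossy step can be illustrated with a toy example. Real JPEG quantizes frequency coefficients after a discrete cosine transform; the sketch below is a deliberate simplification that only shows the principle: once fine tonal differences are rounded away, the data compresses smaller, but the original values cannot be recovered.

```python
# Toy illustration of the lossy step in JPEG-style compression.
import zlib

tones = bytes(range(256)) * 16             # stand-in for raw pixel data
step = 16
lossy = bytes((t // step) * step for t in tones)  # round away fine detail

raw_size = len(zlib.compress(tones))
jpg_like = len(zlib.compress(lossy))
print(raw_size, jpg_like)   # the quantized data compresses smaller
```

Note that the information loss is permanent: every value in `lossy` has been snapped to the nearest step of 16, which is why a JPEG can never be restored to the original RAW data.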

We see light differently than the camera sensor does. Our response, like that of film, is non-linear, so we don't see shadows and highlights cut off abruptly; we perceive a wider range of tones and more detail. Larger sensors with larger pixels tend to capture a greater range of tones, and the fall-off from highlight to shadow looks more natural.
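This mismatch is why linear sensor data is usually redistributed with a gamma curve before display. A minimal sketch, using the common 1/2.2 exponent as an illustrative choice, shows how the curve lifts shadows toward the way our eyes expect to see them:

```python
# Human vision is roughly logarithmic, so linear sensor values are
# gamma-encoded: shadows are lifted and highlights compressed into a
# more natural-looking tonal range. The 2.2 exponent is illustrative.

def gamma_encode(linear, gamma=2.2):
    """Map a linear light value (0..1) to a display value (0..1)."""
    return linear ** (1.0 / gamma)

# A tone at only 10% of full exposure ends up much brighter on screen:
print(round(gamma_encode(0.10), 2))   # ~0.35
print(round(gamma_encode(0.50), 2))   # ~0.73
```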

Camera manufacturers are always trying to improve their sensors and the processors that drive them. So far they have done a great job with faster processors, more megapixels, and wider dynamic range. Be assured that the future has more in store for us.



[Image: image sensor]