Image processing uses techniques that can identify shades, colors and
relationships that cannot be perceived by the human eye. Image processing is
used to solve identification problems, such as in forensic medicine or in
creating weather maps from satellite pictures. It deals with images in bitmapped
graphics format that have been scanned in or captured with digital cameras.
The photodiodes employed in an image sensor are color-blind by nature: they can only record shades of grey. To get color into the picture, they are covered with different color filters: red, green and blue (RGB), arranged in the pattern of the Bayer filter, named after its inventor. As each photodiode records the color information for exactly one pixel of the image, without an image processor there would be a green pixel next to each red and blue pixel. (Actually, most sensors have two green photodiodes for each red and blue one.)
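The mosaic such a sensor records can be sketched in a few lines of NumPy. This is a simplified RGGB layout for illustration only; real sensors differ in pattern orientation and readout details:

```python
import numpy as np

# Conceptual sketch: simulate what a Bayer-filtered sensor records.
# Each photosite keeps only one color channel, tiled in 2x2 blocks:
#   R G
#   G B
# so there are two green samples for every red and blue one.

def bayer_mosaic(rgb):
    """Reduce an H x W x 3 RGB image to a single-channel Bayer mosaic."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites (red rows)
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites (blue rows)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

img = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
raw = bayer_mosaic(img)
# Half of all photosites are green: 8 of the 16 in this 4x4 mosaic.
```

Reconstructing the two missing channels at every site from this single-channel mosaic is exactly the demosaicing task described below.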
The image processing engine comprises a combination of hardware processors and software algorithms. The image processor gathers the luminance and chrominance information from the individual pixels and uses it to compute/interpolate the correct color and brightness values for each pixel. If it does this well, the result is an image with natural and pleasing colors, balanced contrast and appropriate sharpness.
This process, however, is quite complex and involves a number of different operations. Its success depends largely on the "intelligence" of the algorithms applied.
As stated above, the image processor evaluates the color and brightness data of a given pixel, compares them with the data from neighboring pixels and then uses a demosaicing algorithm to produce an appropriate color and brightness value for the pixel. The image processor also assesses the whole picture to estimate the correct distribution of contrast. By adjusting the gamma value (heightening or lowering the contrast range of an image's mid-tones), the processor can make subtle tonal gradations, such as in human skin or the blue of the sky, appear much more realistic.
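Gamma adjustment itself is easy to illustrate. The sketch below uses a bare power-law curve, far simpler than a camera's tuned tone curves: it lifts the mid-tones of an 8-bit image while leaving pure black and pure white untouched:

```python
import numpy as np

# Minimal gamma-adjustment sketch: normalize to [0, 1], raise to the
# power 1/gamma, rescale back to 8 bits. gamma > 1 brightens mid-tones,
# gamma < 1 darkens them; the endpoints 0 and 255 are unchanged.

def adjust_gamma(image, gamma):
    normalized = image.astype(np.float64) / 255.0
    corrected = normalized ** (1.0 / gamma)
    return np.round(corrected * 255.0).astype(np.uint8)

mid_grey = np.full((2, 2), 128, dtype=np.uint8)
lifted = adjust_gamma(mid_grey, 2.2)  # mid-tones pushed brighter
```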
Noise is a phenomenon found in any electronic circuitry. In digital photography its effect is often visible as random spots of obviously wrong color in an otherwise smoothly colored area. Noise increases with temperature and exposure time. When higher ISO settings are chosen, the electronic signal from the image sensor is amplified, which also amplifies the noise, leading to a lower signal-to-noise ratio. The image processor attempts to separate the noise from the image information and to remove it. This can be quite a challenge, as the image may contain areas with fine textures which, if treated as noise, may lose some of their definition.
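The trade-off can be shown with a toy example. A naive smoothing filter suppresses a lone noise spike, but it would smear a genuine one-pixel detail in exactly the same way; this is illustrative only, not any camera's actual algorithm:

```python
import numpy as np

# Toy illustration of the noise-vs-texture trade-off: a 3-tap moving
# average pulls an isolated noise spike toward its neighbors, but it
# cannot tell a spike from a real one-pixel detail.

def mean_filter_1d(row):
    """3-tap moving average with edge replication."""
    padded = np.pad(row.astype(np.float64), 1, mode="edge")
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

flat = np.array([100, 100, 100, 250, 100, 100, 100], dtype=np.float64)
smoothed = mean_filter_1d(flat)
# The spike at index 3 drops from 250 to 150, but a genuine fine
# detail at that position would have been blurred identically.
```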
As the color and brightness values for each pixel are interpolated, some image softening is applied to even out any fuzziness that has occurred. To preserve the impression of depth, clarity and fine detail, the image processor must then sharpen edges and contours. It must therefore detect edges correctly and reproduce them smoothly, without over-sharpening.
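One generic sharpening technique is unsharp masking: subtract a blurred copy of the image to isolate the detail signal, then add a fraction of that detail back. The sketch below (not any manufacturer's actual method) also shows where over-sharpening halos come from:

```python
import numpy as np

# Unsharp-mask sketch on a 1-D row: blurred copy -> detail signal ->
# add a fraction of the detail back. Values on either side of a step
# edge are pushed apart, which the eye reads as a crisper edge.

def unsharp_mask_1d(row, amount=1.0):
    padded = np.pad(row.astype(np.float64), 1, mode="edge")
    blurred = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0
    detail = row - blurred
    return row + amount * detail

edge = np.array([10.0, 10.0, 10.0, 200.0, 200.0, 200.0])
sharpened = unsharp_mask_1d(edge)
# The undershoot before the step and overshoot after it are exactly
# the halos that appear when "amount" is set too high.
```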
With the ever higher pixel count in image sensors, the image processor's speed becomes more critical: photographers don't want to wait for the camera's image processor to complete its job before they can carry on shooting - they don't even want to notice some processing is going on inside the camera. Therefore, image processors must be optimized to cope with more data in the same or even a shorter period of time.
Individual manufacturers have given their image processing engines different names: Canon's is called DIGIC, Nikon's EXPEED, Olympus's TruePic, and Panasonic's the Venus Engine.
Digital image processing
Digital image processing is the use of computer algorithms to perform image processing on digital images. As a subfield of digital signal processing, digital image processing has many advantages over analog image processing; it allows a much wider range of algorithms to be applied to the input data, and can avoid problems such as the build-up of noise and signal distortion during processing.
Nikon’s 16-bit image processor, EXPEED, has some interesting features, such as real-time chromatic aberration correction and Active D-Lighting. D-Lighting technology can “light” the dark parts of the image by using a filter: the system lightens pixels selectively, depending on where each pixel lies in the image.
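Nikon does not publish the D-Lighting algorithm, but the general idea of lightening dark pixels while sparing highlights can be approximated with a tone curve. The curve below is purely hypothetical, chosen only to show the shape of such an adjustment:

```python
import numpy as np

# Conceptual sketch only -- this is NOT Nikon's actual D-Lighting
# algorithm. A darkness-weighted curve gives shadows a strong lift
# while leaving highlights nearly untouched.

def lift_shadows(image, strength=0.5):
    x = image.astype(np.float64) / 255.0
    # Blend toward a brightening curve, weighted by (1 - x): dark
    # pixels get the largest boost, bright pixels barely change.
    lifted = x + strength * (1.0 - x) * (np.sqrt(x) - x)
    return np.round(lifted * 255.0).astype(np.uint8)

tones = np.array([[10, 128, 245]], dtype=np.uint8)
out = lift_shadows(tones)
# The deep shadow gains far more than the mid-tone; the near-white
# pixel is essentially unchanged.
```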
There is also the new “Picture Control”, a unified color system for all Nikon cameras, with Standard, Neutral, Vivid and Monochrome settings. Each can be tuned by the user and transferred to other cameras. This still does not mean Nikon uses color LUTs that can be uploaded to the camera, as Canon does (and all of this is moot if you use raw software from neither Nikon nor Canon). Nikon claims EXPEED is capable of producing the same image quality as Nikon Capture NX, which normally produces excellent image quality - far better than the camera's own processing.
TruePic is the name Olympus has given its image processing engine. The second version was named TruePic TURBO. In 2007 Olympus began to equip its digital cameras, both compacts and D-SLRs, with the latest version, TruePic III.
This engine improved the cameras' ability to reproduce colors naturally. For this, the Advanced Proper Gamma III technology was enhanced to control luminance and chrominance difference signals independently, for more faithful reproduction of pale colors. Individual colors can also be corrected without affecting the reproduction of other colors. Color reproduction was fine-tuned so that colors are not just correct but also pleasing to the human eye. As a result, human skin tones and the blue of the sky can be reproduced more faithfully.
The Advanced Noise Filter III contributes to high-quality reproduction of images by separating the image and noise signals more accurately. It transforms the image from real (spatial) space into frequency space and extracts the signal component, then smooths the signal components while preserving edges.
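The frequency-space idea can be illustrated with NumPy's FFT. This is a plain low-pass cut-off, much cruder than Olympus's filter: the high-frequency bins, where fine-grained noise concentrates, are zeroed before transforming back:

```python
import numpy as np

# Sketch of frequency-space denoising (not Olympus's actual filter):
# transform a 1-D signal, discard high-frequency bins, transform back.

rng = np.random.default_rng(1)
t = np.arange(256)
clean = np.sin(2 * np.pi * t / 64.0)            # smooth, image-like signal
noisy = clean + rng.normal(0.0, 0.3, t.size)    # broadband noise added

spectrum = np.fft.rfft(noisy)
cutoff = 16                     # keep only the lowest-frequency bins
spectrum[cutoff:] = 0.0
denoised = np.fft.irfft(spectrum, n=t.size)

# Most of the noise power lived above the cut-off, so the error drops.
err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
```

A hard cut-off like this also destroys genuine high-frequency detail, which is why the text stresses preserving edges while smoothing.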
To reproduce edges smoothly but still sharply, the Advanced Detail Reproduction technology detects the edge direction and applies a low-pass filter (LPF) along the edge and a high-pass filter (HPF) in the direction normal to the edge. This way edges become smooth and false colors are eliminated.
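A toy version of this direction-aware filtering, assuming a vertical edge (not Olympus's implementation): smooth along the edge direction, sharpen across it:

```python
import numpy as np

# Toy direction-aware edge filtering for a vertical edge: smooth DOWN
# the columns (along the edge) and sharpen ALONG the rows (across it).

def smooth_vertical(img):
    padded = np.pad(img, ((1, 1), (0, 0)), mode="edge")
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

def sharpen_horizontal(img, amount=0.5):
    padded = np.pad(img, ((0, 0), (1, 1)), mode="edge")
    blurred = (padded[:, :-2] + padded[:, 1:-1] + padded[:, 2:]) / 3.0
    return img + amount * (img - blurred)

edge = np.tile([0.0, 0.0, 100.0, 100.0], (4, 1))  # vertical step edge
result = sharpen_horizontal(smooth_vertical(edge))
# Contrast across the edge increases, while every row stays identical:
# the edge is sharpened without becoming ragged along its length.
```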
The new engine also provides a speed improvement, so image sequences at 3 fps are possible even at a 10-megapixel resolution.
Digital camera images
Digital cameras generally include dedicated digital image processing chips to convert the raw data from the image sensor into a color-corrected image in a standard image file format. Images from digital cameras often receive further processing to improve their quality, a distinct advantage digital cameras have over film cameras. The digital image processing is typically done by special software programs that can manipulate the images in many ways.
Many digital cameras also enable viewing of histograms of images, as an aid for the
photographer to better understand the rendered brightness range of each shot.
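Such a brightness histogram is straightforward to compute: count how many pixels fall into each luminance bin. A spike in the lowest or highest bin warns of crushed shadows or blown highlights, as in this sketch:

```python
import numpy as np

# Brightness histogram sketch: bucket 8-bit pixel values into a small
# number of luminance bins, as in a camera's histogram display.

def brightness_histogram(image, bins=8):
    counts, _ = np.histogram(image, bins=bins, range=(0, 256))
    return counts

dark_frame = np.zeros((10, 10), dtype=np.uint8)  # badly underexposed shot
hist = brightness_histogram(dark_frame)
# All 100 pixels pile into the lowest bin -- the classic sign of
# crushed shadows a photographer reads off the histogram.
```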
- Image processing
- Digital image processing
- Digital image editing