eLynx SDK
v3.3.0 C++ image processing API reference
Stands for Charge-Coupled Device. "Charge-coupled" refers to the fact that the pixels on a CCD chip are electrically connected to each other. This allows the chip to be read by shifting the data, one row at a time, toward a read register. The primary advantage of this design is that it avoids circuitry blocking the incoming light, allowing maximum sensitivity to incoming photons.
Bit depth is a computer graphics term describing the number of bits used to represent a color or pixel in a bitmapped image. The concept is also known as bits per pixel (bpp), particularly when specified along with the number of bits used. Higher color depth gives a broader range of distinct colors.
The bit depth commonly used by image files (e.g. JPEG) and monitors is 8 bits per channel, i.e. 24 bits for an RGB pixel. In astronomical imaging we typically use 12 bits per channel (RGB = 36 bits) for acquisition and at least 16 bits per channel (RGB = 48 bits) for processing.
See articles : bit-depth at Wikipedia, bit-depth tutorial.
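Converting between these bit depths is a common first step in a processing pipeline. The following sketch (illustrative helper names, not part of the eLynx API) shows one standard way to expand an 8-bit channel to 16 bits and back, keeping black and white at the extremes of both ranges:

```cpp
#include <cstdint>

// Expand an 8-bit channel value to 16 bits. Multiplying by 257
// maps 0 -> 0 and 255 -> 65535 exactly (65535 = 255 * 257).
inline std::uint16_t expand8to16(std::uint8_t v)
{
    return static_cast<std::uint16_t>(static_cast<std::uint32_t>(v) * 257u);
}

// Reverse direction: reduce a 16-bit channel to 8 bits with rounding.
inline std::uint8_t reduce16to8(std::uint16_t v)
{
    return static_cast<std::uint8_t>((static_cast<std::uint32_t>(v) + 128u) / 257u);
}
```

Note that a naive shift (`v << 8`) would map 255 to 65280 instead of 65535, slightly darkening the whites; the multiply-by-257 form avoids that.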
The dynamic range of a sensor is defined by the largest possible signal divided by the smallest possible signal it can generate. The largest possible signal is directly proportional to the full well capacity of the pixel. The lowest signal is the noise level when the sensor is not exposed to any light, also called the "noise floor".
See articles : Dynamic range of a sensor, Banding in low colour resolution workflows.
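The ratio defined above (largest signal over noise floor) is usually quoted in decibels or in photographic stops. A minimal sketch, with illustrative names and inputs in electrons (not eLynx API calls):

```cpp
#include <cmath>

// Dynamic range in decibels: 20 * log10(full well / noise floor).
double dynamicRangeDb(double fullWellElectrons, double noiseFloorElectrons)
{
    return 20.0 * std::log10(fullWellElectrons / noiseFloorElectrons);
}

// Dynamic range in photographic stops (factors of two): log2 of the ratio.
double dynamicRangeStops(double fullWellElectrons, double noiseFloorElectrons)
{
    return std::log2(fullWellElectrons / noiseFloorElectrons);
}
```

For example, a sensor with a 25,000 e⁻ full-well capacity and a 25 e⁻ noise floor has a signal ratio of 1000, i.e. 60 dB or about 10 stops.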
A raw image file contains minimally processed data from the image sensor of a digital camera. Raw files are so named because they are not yet processed: they cannot be edited directly in a bitmap graphics editor or displayed by a typical web browser. The image must be processed and converted to an RGB format such as TIFF or FITS before it can be manipulated.
Most camera raw files have a bit depth of 12 or 14 bits per channel instead of the 8 used by JPEG. Camera raw files are 2-6 times larger than JPEG files. Some raw formats do not use compression, some implement lossless data compression to reduce the size of the files without affecting image quality, and others use lossy data compression, where quantization and filtering are performed on the image data.
The benefits of raw files are high dynamic range and lossless data; the price is a large memory cost.
See articles : Why should I use RAW?, Dave Coffin's decoder.
A Bayer filter is a color filter array (CFA) for arranging RGB color filters on a grid of photosensors. The RGB color model utilizes the additive model, in which red, green, and blue light are combined in various ways to create other colors.
The raw output of a Bayer-filter camera is referred to as a Bayer pattern image. Two thirds of the color data is missing from each pixel. A demosaicing algorithm is used to interpolate the complete red, green, and blue values for each point. Many different algorithms exist.
The term derives from the name of its inventor, Bryce Bayer of Eastman Kodak.
See articles : RGB "Bayer" Color and MicroLenses.
This is the operation that converts a raw file encoded with a Bayer filter to RGB space. Various demosaicing algorithms exist, each calculating the values of the missing pixels differently.
See articles : Demosaicing at Wikipedia, Digital camera sensors,
A study of spatial color interpolation algorithms for single-detector digital cameras by Ting Chen.
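The simplest of these algorithms is the "superpixel" approach, sketched below with illustrative names (not the eLynx implementation): each 2x2 RGGB cell of the mosaic becomes one RGB output pixel, the two green samples being averaged, at the cost of halving the resolution.

```cpp
#include <cstdint>
#include <vector>

struct Rgb { std::uint16_t r, g, b; };

// Superpixel demosaicing of an RGGB Bayer mosaic (width x height samples):
// each 2x2 cell  R G / G B  yields one half-resolution RGB pixel.
std::vector<Rgb> demosaicSuperpixel(const std::vector<std::uint16_t>& bayer,
                                    int width, int height)
{
    std::vector<Rgb> out;
    out.reserve((width / 2) * (height / 2));
    for (int y = 0; y + 1 < height; y += 2) {
        for (int x = 0; x + 1 < width; x += 2) {
            const std::uint16_t r  = bayer[y * width + x];           // R
            const std::uint16_t g1 = bayer[y * width + x + 1];       // G
            const std::uint16_t g2 = bayer[(y + 1) * width + x];     // G
            const std::uint16_t b  = bayer[(y + 1) * width + x + 1]; // B
            out.push_back({ r, static_cast<std::uint16_t>((g1 + g2) / 2), b });
        }
    }
    return out;
}
```

Full-resolution algorithms (bilinear, gradient-corrected, adaptive) instead interpolate the two missing channels at every photosite, trading computation for detail.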
The Foveon X3 sensor is an image sensor for digital cameras produced by Foveon, Inc. It is a layered sensor design, in which each location in a grid has layered photosensors sensitive to all three primary colors, in contrast to the mosaic Bayer filter sensor design commonly used in digital camera sensors where each location is a single photosensor (pixel) sensitive to only one primary color.
See articles : Foveon at Wikipedia, X3 pixel page.
It all boils down to the concept of color temperature. Color temperature is a way of measuring the quality of a light source. It is based on the ratio of the amount of blue light to the amount of red light; the green light is ignored. The unit for measuring this ratio is the kelvin (K). A light with a higher color temperature (i.e., a larger kelvin value) has "more" blue light than a light with a lower color temperature (i.e., a smaller kelvin value). Thus a visually cooler (bluer) light has a higher color temperature, and a warmer (redder) light a lower one.
Setting the white balance incorrectly causes a color shift in the image. For example, suppose the camera is told to use the color temperature of sunlight to take an image of an indoor environment illuminated mainly by incandescent lights. Expecting plenty of blue light, the camera applies only a modest boost to its blue channel. However, incandescent lighting has a low color temperature, with excessive red light rather than blue; the insufficiently boosted blue channel leaves the image looking reddish or yellowish.
See articles : White balance at Wikipedia, White balance tutorial, What is white balance, Are you RGB Savvy?
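In practice white balance is commonly applied as a per-channel gain, with green as the reference channel. A hedged sketch (illustrative names, not the eLynx API), where the gains are derived from a patch known to be neutral grey under the scene's light:

```cpp
struct RgbF { float r, g, b; };

// Apply white-balance gains. Green is the reference channel (gain 1.0);
// the red and blue gains correct for the illuminant's color temperature.
RgbF whiteBalance(RgbF in, float gainR, float gainB)
{
    return { in.r * gainR, in.g, in.b * gainB };
}

// Derive gains from a patch known to be neutral under the scene light:
// after correction the patch must satisfy R == G == B.
RgbF gainsFromNeutralPatch(RgbF patch)
{
    return { patch.g / patch.r, 1.0f, patch.g / patch.b };
}
```

A warm (incandescent) scene yields a neutral patch with a strong red component, hence a red gain below 1 and a blue gain above 1, which is exactly the correction a "tungsten" preset encodes.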
Pixel binning is a clocking scheme used to combine the charge collected by several adjacent CCD pixels, and is designed to reduce noise and improve the signal-to-noise ratio and frame rate of digital cameras. The binning process is performed by on-chip timing circuitry that assumes control of the serial and parallel shift registers prior to amplification of the CCD analog signal.
See articles : Binning, Pixel binning.
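On a real CCD the summation happens in the shift registers, before read-out noise is added; the software analogue below (a sketch with illustrative names, not the on-chip process) shows the arithmetic: each 2x2 block of charges is summed into one output pixel, trading resolution for signal-to-noise ratio.

```cpp
#include <cstdint>
#include <vector>

// 2x2 software binning: sum each 2x2 block of a width x height image
// into one pixel of a (width/2) x (height/2) output. The output type is
// wider than the input to hold the summed charge without overflow.
std::vector<std::uint32_t> bin2x2(const std::vector<std::uint16_t>& img,
                                  int width, int height)
{
    std::vector<std::uint32_t> out((width / 2) * (height / 2));
    for (int y = 0; y + 1 < height; y += 2)
        for (int x = 0; x + 1 < width; x += 2)
            out[(y / 2) * (width / 2) + (x / 2)] =
                  static_cast<std::uint32_t>(img[y * width + x])
                + img[y * width + x + 1]
                + img[(y + 1) * width + x]
                + img[(y + 1) * width + x + 1];
    return out;
}
```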
Exposure is the amount of light which falls upon the sensor of a digital camera. If there is too much light, a photo becomes over-exposed, weak and washed out looking. When light is insufficient, images or areas within them are dark and under-exposed.
See articles : Controlling Exposure Using Histograms.
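A luminance histogram is the usual tool for judging exposure: a pile-up in the last bins means clipped highlights (over-exposure), a pile-up in the first bins means crushed shadows (under-exposure). A minimal sketch with illustrative names (not eLynx functions):

```cpp
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

// Build a 256-bin histogram of an 8-bit image.
std::array<std::size_t, 256> histogram(const std::vector<std::uint8_t>& pixels)
{
    std::array<std::size_t, 256> bins{};
    for (std::uint8_t p : pixels)
        ++bins[p];
    return bins;
}

// Quick clipping check: fraction of pixels saturated at full scale.
double clippedFraction(const std::vector<std::uint8_t>& pixels)
{
    if (pixels.empty()) return 0.0;
    std::size_t clipped = 0;
    for (std::uint8_t p : pixels)
        if (p == 255) ++clipped;
    return static_cast<double>(clipped) / pixels.size();
}
```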
This is a catch-all term that refers to all kinds of optical problems. The greater the aberration in an optic, the worse the sharpness and contrast will be. The higher the quality of the optic, the fewer and smaller the aberrations it has.
Aberration of light: an astronomical phenomenon which produces an apparent motion of celestial objects.
Aberration in optical systems: an effect that causes light from one point of an object after transmission through the system to arrive at different points. Chromatic aberration or "color fringing" is caused by the camera lens not focusing different wavelengths of light onto the exact same focal plane (the focal length for different wavelengths is different) and/or by the lens magnifying different wavelengths differently. These types of chromatic aberration are referred to as "Longitudinal Chromatic Aberration" and "Lateral Chromatic Aberration" respectively and can occur concurrently. The amount of chromatic aberration depends on the dispersion of the glass.
The state or act of aligning the elements of an optical system. To get the best images from your telescope, the mirrors and/or lenses should be perfectly aligned. Misalignment will create various types of optical aberrations, and reduce the sharpness and contrast of your images. Poor collimation can also result in elongated star images. How you collimate depends on what type of telescope you have.
See article : How to focus a CCD camera.
Depending on the elevation of an object in the sky, the atmosphere absorbs different amounts of color in the light passing through. At a low elevation, blue extinction is pronounced. You see this when the sun sets: instead of being a bright, yellow/white object, it becomes noticeably ruddy. The same thing happens at night. The lower an object is, the greater the blue extinction. You can compensate for this by adjusting the duration of your blue exposure: increase it for objects low to the horizon.
A pixel on a digital camera sensor collects photons, which are converted into an electrical charge by its photodiode. As explained in the dynamic range topic, once the "bucket" is full, the charge caused by additional photons will overflow and have no effect on the pixel value, resulting in a clipped or overexposed pixel value. Blooming occurs when this charge flows over to surrounding pixels, brightening or overexposing them in the process. For example, the charge overflow of overexposed pixels in a bright sky can cause the dark pixels at the edges of leaves and branches to be brightened and overexposed as well, so that detail is lost. Blooming can also increase the visibility of purple fringing.
See articles : Blooming, Blooming vs. Anti-Blooming.
Hot pixels are part of CCD reality. Also referred to as "stuck pixel" or "hot pixel" noise, the effect occurs with long exposures (1-2 seconds or more) and appears as a pattern of colored dots slightly larger than a single pixel. These pixels are not all the same: each has a different charge-leakage rate. The rate also depends on temperature; remember, the more heat, the faster the electrons vibrate, the more of them escape from wherever they are, and the greater the charge leakage.
See articles : Hot pixels, Night Time Shots and Hot Pixels.
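A common remedy, sketched here with illustrative names (not eLynx API calls), is dark-frame subtraction: an exposure of the same duration and temperature is taken with the shutter closed, so each hot pixel's accumulated leakage charge can be subtracted from the light frame.

```cpp
#include <cstdint>
#include <vector>

// Subtract a dark frame from a light frame of the same geometry,
// clamping at zero so noise cannot produce negative pixel values.
std::vector<std::uint16_t> subtractDark(const std::vector<std::uint16_t>& light,
                                        const std::vector<std::uint16_t>& dark)
{
    std::vector<std::uint16_t> out(light.size());
    for (std::size_t i = 0; i < light.size(); ++i)
        out[i] = light[i] > dark[i]
                     ? static_cast<std::uint16_t>(light[i] - dark[i])
                     : 0;
    return out;
}
```

In practice several dark frames are usually averaged into a "master dark" before subtraction, to avoid adding the dark frame's own random noise to the image.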
This is the webcam most widely used for astronomy, thanks to its good quality at low cost.