eLynx SDK
v3.3.0 C++ image processing API reference
Signal is the brightness in your image that is a direct result of photons that have traveled from some object in space to your CCD detector. In an ideal world, one could count these photons one at a time and get an exact result, producing nothing but signal. But in the real world, there are a variety of noise sources to contend with. Noise creates uncertainty in the actual value of each pixel. To improve the signal level in your image, you can simply use longer exposures. Signal increases faster than noise, so longer exposures (within the limits of your equipment) are going to give you better images. For example, if the signal level is 100 units, and the noise level gives an uncertainty of +/- 5 units, then the signal is 20 times larger than the noise level. If the signal level is increased to 1000 units by using an exposure that is ten times longer, and the noise level now gives an uncertainty of +/- 20 units, then the signal level is 50 times larger than the noise level: the noise grew, but much more slowly than the signal. This is why most CCD imagers are always looking to refine their imaging, guiding, and mount-tuning skills to get longer exposures. The length of your exposure is also limited by the saturation level of your chip, however.
Noise is the brightness in your image that is a direct result of anything other than the photons that have traveled from some object in space to your CCD detector. Noise can occur from photons during light collection, or it can occur from electrons after the photons are converted to electrons. It can also occur when you process your image. Noise creates uncertainty in the brightness level of the pixels in your image. If a picture has a lot of noise in it, you can see this visually: there will be variations in brightness in areas where you expect to see a more uniform brightness. Words commonly used to describe this are "grainy" and "gritty". Technically, noise is random and unpredictable. That's what makes it noise: there is no way to predict it or remove it from the image. We also use the word noise to refer to things that are repeatable and can therefore be removed, such as dark current (pixel values that accumulate over time, with or without photons striking the detector) and bias (pixel values that exist even in the shortest possible exposure). Measuring the amount of noise is a job for a statistician; the math will be daunting for all but the mathematically hardy.
Some common sources of noise in CCD imaging include photon (shot) noise from the random arrival of the photons themselves, dark current, read noise from the camera electronics, and bias.
See articles: Concept and types, Examples and Characteristics.
Commonly written S/N or SNR, this is simply the ratio of the signal in your image to the noise in your image. If the signal is five times larger than the noise, then you have a signal-to-noise ratio of 5. If the signal is 1000, and the noise is +/- 25, then the signal-to-noise ratio is 40. As you might expect, this ratio varies for every pixel in the image. Actually measuring the signal and noise in your image so that you can quantify the signal-to-noise ratio is a bit complex, however, both in practical terms and in mathematical terms. The simplest way to think of the noise in your image is that it is the uncertainty in the brightness level. If 100 photons arrive at a certain pixel, we might be uncertain whether the real value is 90 or 110. The uncertainty is the result of the various sources of noise. To find the level of noise, you must take many images of the same duration and compare the results statistically. Such a statistical analysis provides a measure of the noise, and allows you to quantify the quality of your images with an S/N ratio.
See articles: The Signal-to-Noise Connection, by Mike Newberry.
This is the lower limit of the data displayed. All pixels dimmer than the black point will appear black on your display. The best place for the black point is on the left side of the main data peak in the image histogram. If the histogram shows more than one main peak, your image probably suffers from gradients, and no single black point will work. Fix the gradients before you try to set an accurate, final black point.
A histogram is a map of the brightness levels in an image. It's a graph that shows increasing numbers of pixels in the vertical direction, and increasing brightness in the horizontal direction. A high peak to the left means there are lots of dark pixels in the image; this situation is typical of an astronomical photo. There are two types of histogram changes you typically make when processing an image: linear changes, such as setting the black and white points, and nonlinear stretches, which brighten the faint data without saturating the bright data.
See articles: Statistics.