Our sun emits a wide spectrum of energy (light), which is reflected by objects on the Earth's surface. A remote sensing EO imaging camera from MAPIR can capture this reflected light in the wavelengths to which the camera's sensor is sensitive.
The camera sensors we currently sell are silicon-based, which makes them sensitive in the visible and near-infrared (NIR) spectrum from about 400-1100 nm. The chart below shows the relationship between the Sun's emission, a silicon image sensor's sensitivity, and the human eye's sensitivity.
By using band-pass filters on our cameras that allow only a narrow spectrum/band of light to reach the sensor, we can selectively capture how much light objects reflect in that particular band. These differences in reflectance can then be used to identify specific materials, as well as to detect changes in known materials over time.
Below is our popular triple (3) band-pass RGN filter, which allows Red + Green + Near Infrared (NIR) light to pass through. All other wavelengths of light, such as blue, orange, yellow, and red-edge, are blocked.
The pixels in each of the 3 image channels represent the amount of light captured: brighter (white) pixels have higher values and captured more light, while darker (black) pixels have lower values and captured less. For the RGN filter, the first image channel captures red light, the second captures green light, and the third captures near-infrared (NIR) light. A white pixel in the first image channel therefore represents a high reflectance of red light in the scene.
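As a rough illustration, here is a minimal sketch in Python with NumPy of how the three channels of an RGN image map to Red, Green, and NIR. A synthetic array stands in for a real capture, so the example is self-contained:

```python
import numpy as np

# A synthetic 3-channel image stands in for a real RGN capture here; in
# practice you would load the camera's JPG/TIFF with an image library.
rgn = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)

# For the RGN filter the channels map, in order, to Red, Green, and NIR.
red, green, nir = rgn[..., 0], rgn[..., 1], rgn[..., 2]

# Higher pixel values mean more captured light, so the brightest pixel in
# the first channel marks the strongest red reflectance in the scene.
print("Brightest red pixel value:", red.max())
```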
Below you can see the 3 image channels from an RGN camera (left to right: Red, Green, NIR), before they have been calibrated:
Here is the same RGN image after calibration for reflectance (Red, Green, NIR):
Without calibration, pixel values do not represent percent reflectance, so we need to establish a relationship between pixel values and reflectance by photographing materials whose percent reflectance has been measured. That is what our reflectance calibration targets are for.
By capturing an image of our reflectance targets and averaging the pixel values of each of the 4 targets, we create reflectance calibration formulas. These calibration formulas can then be used to convert every pixel in an image into percent reflectance.
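To make that concrete, here is a minimal sketch of the calibration step in Python with NumPy, assuming a linear relationship between pixel value and reflectance for a single channel. The target reflectance values and measured pixel means below are hypothetical, and MAPIR's calibration software may use a different model:

```python
import numpy as np

# Known reflectance of the 4 calibration targets for one channel, as
# fractions (0.0-1.0). These numbers are hypothetical; use the measured
# values supplied with your targets.
target_reflectance = np.array([0.02, 0.18, 0.38, 0.87])

# Average pixel value measured inside each target's region of the image
# for the same channel (also hypothetical).
target_pixel_means = np.array([11.0, 52.0, 104.0, 229.0])

# Fit a linear calibration formula: reflectance = slope * pixel + intercept.
slope, intercept = np.polyfit(target_pixel_means, target_reflectance, 1)

def calibrate(channel: np.ndarray) -> np.ndarray:
    """Convert raw pixel values in one channel to percent reflectance."""
    return (slope * channel.astype(np.float64) + intercept) * 100.0

# Apply the formula to every pixel of a raw 8-bit channel.
raw_channel = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
reflectance_pct = calibrate(raw_channel)
```

The same fit is repeated per channel, since each channel has its own sensor sensitivity and target reflectance values.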
The calibration formulas are specific to the camera sensor used, any additional filters installed, and the measured reflectance of the reference target materials. They are also specific to the current camera exposure settings; if the exposure or the ambient light changes, the calibration needs to be redone with the new conditions taken into account.
Since the calibration uses reference targets of known reflectance, the calibrated results can be compared with other calibrated data captured under different lighting conditions. That means once calibrated, multiple separate datasets can be compared even if they were captured at different times of day or in different locations around the world. Without calibration the images cannot be meaningfully compared, because their pixel values are not tied to a known reflectance reference standard.
In summary, reflectance is the amount of light in a particular band/spectrum that bounces/reflects off of an object, and in order to measure reflectance you must use reference targets of known reflectance.
Below you can see an image that is not calibrated for reflectance (left) and the same image calibrated (right). Notice that the NDVI values in the non-calibrated image are all too low and incorrect. Always calibrate your images; otherwise the pixel data will not be useful.
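For reference, NDVI is computed per pixel as (NIR - Red) / (NIR + Red). Here is a minimal sketch in Python with NumPy, assuming the inputs are calibrated reflectance channels (the sample values are hypothetical):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    denom[denom == 0] = 1.0  # both channels are 0 there, so NDVI becomes 0
    return (nir - red) / denom

# Hypothetical calibrated reflectance fractions for two pixels.
nir = np.array([[0.60, 0.05]])
red = np.array([[0.08, 0.05]])
print(ndvi(nir, red))  # vegetation-like ~0.76, bare-soil-like 0.0
```

Feeding raw, uncalibrated pixel values into the same formula produces shifted NDVI numbers, which is the kind of error visible in the non-calibrated image above.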