Spectral Edge RGB+IR Fusion technology (or Spectral Edge Fusion), which will be formally launched into the professional security market at ISC West next month, enables surveillance cameras to capture high-resolution, colour-accurate images even in very low or mixed lighting conditions. Spectral Edge Fusion (Fusion) delivers detailed images even in foggy, misty or hazy conditions that traditional cameras would be unable to capture.
Spectral Edge Fusion achieves this by intelligently balancing Red Green Blue (RGB) and Near-Infrared (NIR) light in real time. It is the only technology on the market today that can fuse RGB and NIR light without introducing pixelation, artefacts or additional noise.
Fusion works by taking four channels – Red, Green, Blue and NIR – and reducing them to three by fusing the NIR light into the RGB, creating very smooth transitions as natural light falls below the point where fine detail would ordinarily be lost. The technology makes the most of whatever illumination is present, even at the lowest light levels, well below one lux.
The power of Spectral Edge Fusion is in the way it fuses and tunes RGB and NIR data, processing these light sources together within the same image. It ensures minimal loss of resolution, while delivering improved contrast, dynamic range and signal-to-noise ratio. As a result, what is lost in colour as daylight fades is made up for by the additional texture and depth that NIR lighting provides.
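The actual fusion algorithm is proprietary, but the general idea described above – blending NIR detail into the visible image more heavily as light levels fall – can be sketched in a simplified form. The function below is a hypothetical illustration, not Spectral Edge's method: it folds the NIR plane into the luminance of the RGB image using a weight derived from an assumed 0-to-1 light-level estimate.

```python
import numpy as np

def fuse_rgb_nir(rgb, nir, light_level):
    """Toy RGB+NIR fusion sketch (illustrative only).

    rgb:  H x W x 3 array with values in [0, 1]
    nir:  H x W array with values in [0, 1]
    light_level: estimated ambient light, 0.0 (dark) to 1.0 (daylight)
    """
    # Approximate luminance of the visible image (Rec. 601 weights).
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Blend weight: rely on NIR detail more as visible light fades.
    w = 1.0 - np.clip(light_level, 0.0, 1.0)
    fused_luma = (1.0 - w) * luma + w * nir
    # Re-scale the RGB channels to carry the fused luminance,
    # keeping the original chromaticity where possible.
    scale = fused_luma / np.maximum(luma, 1e-6)
    return np.clip(rgb * scale[..., None], 0.0, 1.0)
```

In full daylight the weight is zero and the visible image passes through unchanged; as the light estimate drops, NIR texture progressively replaces the fading visible luminance while colour ratios are preserved.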
Camera build saving
Fusion also generates a build saving for camera manufacturers: only one sensor is needed for day/night cameras, and no mechanical switching is required. It removes the need for a mechanical IR-cut filter, for example, further helping to minimise moving parts that wear out over time and extending surveillance cameras' Mean Time to Failure.
An additional cost advantage flows from using NIR because it can be fabricated on the same semiconductor process as an RGB sensor. NIR can therefore be combined on the same sensor chip as RGB at no extra cost, simply by replacing some of the visible-light pixels with pixels sensitive to NIR. Sensors that support both RGB and NIR wavelengths already exist: the ON Semiconductor AR0237, the OmniVision OV4686 and the Himax HM2143, to name three.
Improves the accuracy of a wide range of person, event and action recognition applications
Spectral Edge Fusion optimises existing video content analytics applications such as motion detection, object detection, tripwire and facial recognition. It is being integrated into a number of surveillance-ready chipsets compatible with both 2×2 and 4×4 Bayer Pattern RGB+IR CMOS sensors, to improve image accuracy and performance for facial recognition and other security applications.
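In an RGB+IR CMOS sensor of the kind mentioned above, some positions in the colour filter mosaic sense NIR instead of visible light. As a rough, hypothetical sketch of what the first processing step looks like, the function below splits a raw frame into its four sub-sampled channel planes, assuming a repeating 2×2 cell laid out as R/G on the top row and IR/B on the bottom; actual layouts vary between sensor vendors, and 4×4 patterns interleave the IR pixels more sparsely.

```python
import numpy as np

def split_rgbir_2x2(raw):
    """Split a raw 2x2 RGB-IR mosaic into four channel planes.

    Assumes the repeating 2x2 cell is
        R  G
        IR B
    (a hypothetical layout for illustration; real sensors differ).
    raw: H x W array with even H and W.
    """
    r  = raw[0::2, 0::2]  # top-left of each cell
    g  = raw[0::2, 1::2]  # top-right
    ir = raw[1::2, 0::2]  # bottom-left
    b  = raw[1::2, 1::2]  # bottom-right
    return r, g, b, ir
```

Each returned plane is a quarter of the raw resolution; a full pipeline would then interpolate (demosaic) the missing samples before any fusion step.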
In tests, Fusion proved highly effective at identifying objects such as vehicle licence plates and colours, and at spotting and flagging 'security incidents' such as a pedestrian crossing a railway track or motorway in low light.
Improved facial recognition
Fusion can generate high-quality images capable of reliably identifying an individual not only in low-light conditions but also under 'mixed lighting', over-exposure and under-exposure, as seen below. As many as 22 of the 25 images taken from a bus camera would not be of high enough quality to enable automated positive identification using facial recognition software.
However, by combining RGB and NIR light to add detail and depth, it would be possible to obtain a reliable, consistent image capable of accurately identifying an individual in more than 90 per cent of these cases.
Now that Spectral Edge Fusion software can be integrated into existing chipsets without significantly increasing the silicon area, Spectral Edge has turned its attention to developing chip vendor partnerships. It is developing a prototype camera reference design with RGB+IR fusion capability built in, and is already talking to camera vendors about conducting field trials with this prototype in challenging, real-world conditions.