Many factors must be considered when choosing a lens for a thermal imaging camera. These include knowledge of the technology being used and of the intended application for the imager. It is also essential to have a clear idea of what results must be achieved within the price range budgeted for a new thermal camera lens (the entire lens assembly mounted to the camera). Knowing the critical aspects of thermal camera lenses makes the selection process easier.
There are three main regions, or wavebands, of sensitivity common to today's IR cameras. The first is the near- or short-wave IR (SWIR), which spans approximately 0.9 to 2.5 µm. Next is the midwave IR (MWIR), roughly 3 to 5 µm, and finally the long-wave IR (LWIR), from about 8 to 12 µm. Some IR cameras can work outside these bands, but they are generally optimized for one of them.
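As a quick illustration, these bands reduce to a simple lookup. The following is a minimal sketch using the approximate cutoffs quoted above; the names and limits are conventions rather than hard standards, and real camera and lens responses taper off gradually rather than cutting off sharply:

```python
# Approximate wavebands quoted above (values in µm); conventional
# figures, not hard standards.
IR_BANDS = {
    "short-wave IR (SWIR)": (0.9, 2.5),
    "midwave IR (MWIR)": (3.0, 5.0),
    "long-wave IR (LWIR)": (8.0, 12.0),
}

def classify_wavelength(wavelength_um: float) -> str:
    """Return the name of the waveband containing a wavelength, if any."""
    for name, (low, high) in IR_BANDS.items():
        if low <= wavelength_um <= high:
            return name
    return "outside the common wavebands"

for wl in (1.5, 4.2, 6.5, 10.0):
    print(f"{wl} µm -> {classify_wavelength(wl)}")
```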
Thermal camera lenses are designed to operate over a set waveband, whether one of the bands listed above or a range spanning more than one band; e.g., 1.5 to 5 µm or 3 to 12 µm. When a lens is designed for a particular waveband, many factors influence its performance, including material selection, lens thickness, air spacing, surface curvatures and coatings.
That said, an IR camera optimized as a long-wave device may, for example, retain some low-level sensitivity down to 3 µm, and a particular application may require using that full range of sensitivity. If a lens designed for the long-wave IR is merely coated differently to achieve maximum transmission over the complete range, image quality below 8 µm will be very poor. To ensure image quality, a lens would have to be designed specifically to perform over the full 3- to 12-µm range, taking the factors listed above into account.
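The distinction can be made concrete with a minimal sketch. The `design_covers` helper and the band endpoints below are illustrative assumptions only; in practice, performance falls off gradually rather than passing or failing a binary check:

```python
def design_covers(lens_band, camera_band):
    """True only if the lens's *design* waveband spans the camera's full
    sensitivity range; a broadband coating alone does not extend it."""
    return lens_band[0] <= camera_band[0] and lens_band[1] >= camera_band[1]

camera = (3.0, 12.0)  # camera sensitive from 3 to 12 µm

# LWIR-only design with a broadband coating: transmission may be high,
# but the design band still does not cover the camera's full range.
print(design_covers((8.0, 12.0), camera))  # False

# Lens designed from the outset for the full 3- to 12-µm range:
print(design_covers((3.0, 12.0), camera))  # True
```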
