Light Detection and Ranging (LiDAR) has evolved from a niche meteorological tool into the "eyes" of autonomous vehicles, industrial robots, and precision mapping drones. At its core, a LiDAR system measures distance by emitting a pulse of light and timing how long it takes to reflect off an object and return to a sensor.
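The time-of-flight principle reduces to a one-line formula: distance is half the round-trip path of the pulse. A minimal sketch (the 667 ns echo time below is an assumed example value):

```python
# Time-of-flight ranging: the pulse travels to the target and back,
# so the distance is half the round-trip path.
C = 299_792_458  # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the target given the round-trip echo time in seconds."""
    return C * round_trip_s / 2.0

# A pulse returning after ~667 ns corresponds to a target roughly 100 m away.
print(tof_distance(667e-9))  # ~99.98 m
```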
However, the effectiveness of this system is entirely dependent on the quality and integration of its optical components. An optimized optical train ensures that the laser pulse is focused, the return signal is captured efficiently, and ambient noise (like sunlight) is filtered out. In this guide, we will break down the critical components required to build a high-performance LiDAR system.
The choice of laser dictates the entire optical design. Most modern LiDAR systems operate at either 905 nm or 1550 nm: 905 nm pairs with inexpensive silicon detectors, while 1550 nm permits higher eye-safe power but requires costlier InGaAs detectors.
Beyond wavelength, you must consider the beam divergence and pulse width. A narrower pulse width improves ranging accuracy (depth resolution), while low beam divergence ensures the light stays focused over long distances.
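The link between pulse width and depth resolution can be made concrete with the standard limit ΔR = c·τ/2 (a theoretical bound; real systems also depend on detector timing and signal processing):

```python
C = 299_792_458  # speed of light in vacuum, m/s

def range_resolution(pulse_width_s: float) -> float:
    """Theoretical depth-resolution limit set by pulse width: c * tau / 2."""
    return C * pulse_width_s / 2.0

# A 5 ns pulse limits depth resolution to roughly 0.75 m;
# shortening it to 1 ns improves that to about 0.15 m.
print(range_resolution(5e-9))  # ~0.75 m
print(range_resolution(1e-9))  # ~0.15 m
```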
Once the light leaves the laser source, it must be shaped. Raw semiconductor laser beams are often highly divergent and elliptical. To fix this, engineers use collimating lenses.
Aspheric Lenses are the gold standard for LiDAR collimation. Unlike standard spherical lenses, aspheres are designed to eliminate spherical aberration, providing a much cleaner, tighter beam. This is crucial for long-range applications where even a small amount of scatter can lead to significant energy loss at the target.
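To see why low divergence matters at range, a simple geometric sketch (the 5 mm beam and 1 mrad full-angle divergence below are assumed example values, typical of a well-collimated source):

```python
import math

def spot_diameter(d0_m: float, divergence_rad: float, range_m: float) -> float:
    """Beam diameter at a given range, for a full-angle divergence.
    Simple geometric model: the beam grows linearly with distance."""
    return d0_m + 2.0 * range_m * math.tan(divergence_rad / 2.0)

# A 5 mm beam with 1 mrad full-angle divergence grows to ~20.5 cm at 200 m,
# spreading the pulse energy over a far larger area than at the aperture.
print(spot_diameter(0.005, 1e-3, 200.0))  # ~0.205 m
```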
In more advanced systems, Diffractive Optical Elements (DOEs) or Microlens Arrays are used to transform a single beam into a "flash" of light or a specific structured pattern. This is common in "Flash LiDAR" systems that capture an entire field of view at once without moving parts.
The receiving side of a LiDAR system is arguably more difficult to design than the transmitter. The reflected light coming back from a distant object is incredibly weak—often just a few photons.
The goal of receiving optics is to maximize the effective aperture while maintaining a wide enough field of view (FOV). High-quality objective lenses with large diameters are used to gather as much light as possible. These lenses must be coupled with Avalanche Photodiodes (APDs) or Silicon Photomultipliers (SiPMs) which provide the internal gain necessary to detect those faint returns.
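The payoff of a larger aperture can be sketched with a simplified link budget for a diffuse (Lambertian) target; this is an illustrative model only, and the efficiency factor and parameter values are assumptions, with atmospheric loss ignored:

```python
import math

def received_power(p_tx_w: float, reflectivity: float,
                   aperture_d_m: float, range_m: float,
                   efficiency: float = 0.5) -> float:
    """Simplified LiDAR link budget for a Lambertian target:
    P_rx = P_tx * rho * A / (pi * R^2) * eta, with A the receiver area.
    Atmospheric attenuation is ignored in this sketch."""
    area = math.pi * (aperture_d_m / 2.0) ** 2
    return p_tx_w * reflectivity * area / (math.pi * range_m ** 2) * efficiency

# Doubling the receiver diameter quadruples the collected power.
small = received_power(1.0, 0.1, 0.025, 100.0)
large = received_power(1.0, 0.1, 0.050, 100.0)
print(large / small)  # 4.0
```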
In outdoor LiDAR applications, sunlight is the primary source of noise. Without proper filtering, the detector would be overwhelmed by solar radiation.
Narrow Bandpass Filters are integrated into the receiver path. These filters are engineered with thin-film coatings that transmit only a narrow band centered on the laser wavelength (e.g., 1550 nm plus or minus a few nanometers), blocking out everything else.
Additionally, Anti-Reflective (AR) Coatings are applied to every glass surface in the system. Each uncoated glass-to-air interface reflects about 4% of the light. In a multi-lens system, these losses add up quickly, reducing the range and accuracy of the LiDAR unit.
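The cumulative effect of those per-surface losses compounds multiplicatively; a quick sketch (the 0.5% coated-surface loss is an assumed example figure for a typical AR coating):

```python
def transmission(n_surfaces: int, loss_per_surface: float = 0.04) -> float:
    """Fraction of light surviving n air-glass interfaces,
    each reflecting away a given fraction of the incident light."""
    return (1.0 - loss_per_surface) ** n_surfaces

# Six uncoated surfaces (a three-lens train) pass only ~78% of the light;
# with AR coatings at an assumed ~0.5% loss per surface, ~97% survives.
print(transmission(6))         # ~0.783
print(transmission(6, 0.005))  # ~0.970
```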
Selecting the components is only half the battle; integrating them into a robust housing is where the engineering rigor truly begins.
Q: Why is 1550 nm preferred for long-range LiDAR?
A: The human eye's cornea and lens absorb 1550 nm light before it reaches the retina, making it much safer. This allows manufacturers to use higher power levels, which translates to a longer detection range (often over 200 meters).
Q: Can I use plastic lenses for LiDAR?
A: While plastic (polymer) lenses are lightweight and cheap, they often suffer from thermal instability and lower optical clarity. For high-precision automotive or industrial LiDAR, glass or fused silica is generally preferred.
Q: What does "Boresighting" mean in LiDAR?
A: Boresighting is the process of aligning the optical axis of the laser transmitter with the optical axis of the receiver to ensure they are perfectly parallel.