== Components ==
[[File:LIDAR-scanned-SICK-LMS-animation.gif|thumb|A basic lidar system involves a laser range finder reflected by a rotating mirror (top). The laser is scanned around the scene being digitized, in one or two dimensions (middle), gathering distance measurements at specified angle intervals (bottom).]]
{{more citations needed section|date=April 2017}}

=== Laser ===
600–1,000 [[nanometre|nm]] [[laser]]s are most common for non-scientific applications. To make the laser eye-safe for people on the ground, its maximum power is limited, or an automatic shut-off system turns the laser off at specific altitudes. A common alternative, the 1,550 nm laser, is eye-safe at relatively high power levels because this wavelength is not strongly absorbed by the eye. A trade-off is that current detector technology at 1,550 nm is less advanced, so these wavelengths are generally used at longer ranges with lower accuracies. They are also used for military applications because 1,550 nm is not visible in [[Night vision device|night vision goggles]], unlike the shorter 1,000 nm infrared laser. Airborne topographic mapping lidars generally use 1,064 nm diode-pumped [[Yttrium aluminium garnet|YAG]] lasers, while [[bathymetric]] (underwater depth research) systems generally use 532 nm [[Second-harmonic generation|frequency-doubled]] diode-pumped YAG lasers, because 532 nm penetrates water with much less [[attenuation]] than 1,064 nm. Laser settings include the laser repetition rate, which controls the data collection speed. Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, [[YLF]], etc.), and [[Q-switch]] (pulsing) speed.
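The link between pulse length and achievable range resolution can be sketched numerically. The relation below is the standard two-way time-of-flight formula (range resolution = c·τ/2), not a figure taken from any particular system described in this section:

```python
# Range resolution of a pulsed lidar: two surfaces along the beam can be
# distinguished only if they are more than c * tau / 2 apart, where tau
# is the pulse duration (two-way time-of-flight geometry).

C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(pulse_duration_s: float) -> float:
    """Minimum resolvable separation between two returns, in metres."""
    return C * pulse_duration_s / 2.0

# A 10 ns pulse resolves surfaces about 1.5 m apart; a 1 ns pulse,
# about 15 cm, which is why shorter pulses give better target resolution.
print(range_resolution_m(10e-9))
print(range_resolution_m(1e-9))
```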
Better target resolution is achieved with shorter pulses, provided the lidar receiver detectors and electronics have sufficient bandwidth.<ref name="cracknell" />

A [[phased array]] can illuminate any direction by using a microscopic array of individual antennas. Controlling the timing (phase) of each antenna steers a cohesive signal in a specific direction. Phased arrays have been used in radar since the 1940s. On the order of a million optical antennas are needed to produce a radiation pattern of a certain size in a certain direction, and the phase of each individual antenna (emitter) must be precisely controlled. It is very difficult, if possible at all, to use the same technique in a lidar: all individual emitters must be coherent (technically coming from the same "master" oscillator or laser source) and must have dimensions on the order of the wavelength of the emitted light (around 1 micron) to act as point sources with precisely controlled phases.

[[MEMS|Microelectromechanical mirrors (MEMS)]] are not entirely solid-state, but their tiny form factor provides many of the same cost benefits. A single laser is directed to a single, rapidly spinning mirror that can be reoriented to view any part of the target field. However, MEMS systems generally operate in a single plane (left to right). Adding a second dimension generally requires a second mirror that moves up and down; alternatively, another laser can hit the same mirror from another angle. MEMS systems can be disrupted by shock and vibration and may require repeated calibration.<ref name=":11">{{Cite news |last=Mokey |first=Nick |date=2018-03-15 |title=A self-driving car in every driveway? Solid-state lidar is the key |work=Digital Trends |url=https://www.digitaltrends.com/cars/solid-state-lidar-for-self-driving-cars/ |access-date=2018-06-15}}</ref>

=== Scanner and optics ===
Image development speed is affected by the speed at which the scene is scanned.
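As a rough illustration of why scan speed bounds image development speed: each image point costs one laser pulse, so the frame rate of a scanned system cannot exceed the pulse repetition rate divided by the number of points per frame. The figures below are arbitrary assumptions chosen for illustration, not values from the text:

```python
def frames_per_second(pulse_rate_hz: float,
                      points_per_line: int,
                      lines_per_frame: int) -> float:
    """Upper bound on the frame rate of a scanned lidar: every point in
    the image requires one range measurement, i.e. one laser pulse."""
    return pulse_rate_hz / (points_per_line * lines_per_frame)

# Hypothetical example: a 100 kHz laser scanning 1,000 points per line
# over 64 lines can develop at most about 1.56 frames per second.
print(frames_per_second(100_000, 1000, 64))
```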
Options to scan the [[azimuth]] and elevation include dual oscillating plane mirrors, a combination with a polygon mirror, and a [[Laser scanning|dual axis scanner]]. Optic choices affect the angular resolution and range that can be detected. A hole mirror or a [[beam splitter]] are options to collect a return signal.

=== Photodetector and receiver electronics ===
Two main [[photodetector]] technologies are used in lidar: [[Solid-state electronics|solid-state]] photodetectors, such as silicon avalanche [[photodiode]]s, and [[photomultiplier]]s. The sensitivity of the receiver is another parameter that has to be balanced in a lidar design.

=== Position and navigation systems ===
Lidar sensors mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a [[Global Positioning System]] receiver and an [[inertial measurement unit]] (IMU).

=== Sensor ===
Lidar uses active sensors that supply their own illumination source. The energy source hits objects, and the reflected energy is detected and measured by sensors. Distance to the object is determined by recording the time between the transmitted and backscattered pulses and using the speed of light to calculate the distance traveled.<ref name="NASA">{{Cite web|url=https://earthdata.nasa.gov/user-resources/remote-sensors#hyperspectral|title=Remote Sensors {{!}} Earthdata|website=earthdata.nasa.gov|access-date=2017-03-18}} {{PD-notice}}</ref> Flash lidar allows for 3-D imaging because of the camera's ability to emit a larger flash and sense the spatial relationships and dimensions of the area of interest with the returned energy. This allows for more accurate imaging because the captured frames do not need to be stitched together, and the system is not sensitive to platform motion.
This results in less distortion.<ref>{{Cite web|url=https://asc3d.com/our-technology/|title=Advanced Scientific Concepts Inc|website=asc3d.com|access-date=2022-07-03}}</ref>

3-D imaging can be achieved using both scanning and non-scanning systems. "3-D gated viewing laser radar" is a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera. Research has begun into virtual beam steering using [[Digital Light Processing]] (DLP) technology.

Imaging lidar can also be performed using arrays of high-speed detectors and modulation-sensitive detector arrays, typically built on single chips using [[complementary metal–oxide–semiconductor]] (CMOS) and hybrid CMOS/[[Charge-coupled device|charge-coupled device]] (CCD) fabrication techniques. In these devices each pixel performs some local processing, such as demodulation or gating, at high speed, downconverting the signals to video rate so that the array can be read like a camera. Using this technique, many thousands of pixels/channels may be acquired simultaneously.<ref>{{cite patent |country=US |number=5081530 |status=patent |title=Three Dimensional Camera and Rangefinder |gdate=1992-01-14 |invent1=Medina, Antonio |assign1=Medina, Antonio}}</ref> High-resolution 3-D lidar cameras use [[homodyne detection]] with an electronic CCD or CMOS [[Shutter (photography)|shutter]].<ref name="Medina A, Gayá F, and Pozo F 800–805 http://www.opticsinfobase.org/josaa/abstract.cfm?URI=josaa-23-4-800">{{Cite journal|vauthors=Medina A, Gayá F, Pozo F|year=2006|title=Compact laser radar and three-dimensional camera|journal= Journal of the Optical Society of America A|volume=23|issue=4|pages=800–805|bibcode=2006JOSAA..23..800M|doi=10.1364/josaa.23.000800|pmid=16604759}}</ref> A coherent imaging lidar uses [[synthetic array heterodyne detection]] to enable a staring single-element receiver to act as though it were an imaging array.<ref name="Strauss" />

In 2014, [[MIT Lincoln Laboratory|Lincoln Laboratory]] announced a
new imaging chip with more than 16,384 pixels, each able to image a single photon, enabling it to capture a wide area in a single image. An earlier generation of the technology, with one quarter as many pixels, was deployed by the U.S. military after the January 2010 Haiti earthquake. A single pass by a business jet at {{cvt|3000|m|ft|-3}} over Port-au-Prince was able to capture instantaneous snapshots of {{cvt|600|m|ft|-2}} squares of the city at a resolution of {{cvt|30|cm|ft|0}}, displaying the precise height of rubble strewn in city streets.<ref>{{cite web|url=https://www.technologyreview.com/s/524166/the-worlds-most-powerful-3-d-laser-imager/|title=The World's Most Powerful 3-D Laser Imager|date=2014-02-13|website=technologyreview.com|access-date=2017-04-06}}</ref> The new system is ten times better and could produce much larger maps more quickly. The chip uses [[indium gallium arsenide]] (InGaAs), which operates in the infrared spectrum at a relatively long wavelength that allows for higher power and longer ranges. In many applications, such as self-driving cars, the new system will lower costs by not requiring a mechanical component to aim the chip.
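The per-pixel demodulation performed by the modulation-sensitive detector arrays described above is often illustrated with the classic four-sample ("four-bucket") scheme. The sketch below is illustrative rather than a description of any specific chip; it assumes an amplitude-modulated beam and correlation samples taken a quarter of a modulation period apart:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def four_bucket_phase(a0, a1, a2, a3):
    """Recover the return-signal phase from four correlation samples,
    assuming a_i is proportional to cos(phi + i*pi/2) plus a constant
    background (which cancels in the differences)."""
    return math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)

def distance_from_phase(phase_rad, mod_freq_hz):
    """Phase shift to distance; unambiguous only up to c / (2 * f_mod)."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

# With 10 MHz modulation, a half-cycle phase shift (pi radians)
# corresponds to roughly 7.5 m, half the ~15 m unambiguous range.
print(distance_from_phase(math.pi, 10e6))
```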
InGaAs uses less hazardous wavelengths than conventional silicon detectors, which operate at visible wavelengths.<ref>{{cite magazine|last=Talbot|first=David|date=2014-02-13|title=New Optical Chip Will Sharpen Military and Archeological Aerial Imaging|url=http://www.technologyreview.com/news/524166/the-worlds-most-powerful-3-d-laser-imager|access-date=2014-02-17|magazine=[[MIT Technology Review]]}}</ref> New technologies for infrared [[Photon counting|single-photon counting]] LIDAR are advancing rapidly, including arrays and cameras in a variety of [[Single-photon avalanche diode|semiconductor]] and [[Superconducting nanowire single-photon detector|superconducting]] platforms.<ref>{{Cite journal |url=https://opg.optica.org/optica/viewmedia.cfm?uri=optica-10-9-1124&html=true |access-date=2023-08-29 |journal=Optica |doi=10.1364/optica.488853 | title=Single-photon detection for long-range imaging and sensing | date=2023 | last1=Hadfield | first1=Robert H. | last2=Leach | first2=Jonathan | last3=Fleming | first3=Fiona | last4=Paul | first4=Douglas J. | last5=Tan | first5=Chee Hing | last6=Ng | first6=Jo Shien | last7=Henderson | first7=Robert K. | last8=Buller | first8=Gerald S. | volume=10 | issue=9 | page=1124 | doi-access=free |bibcode=2023Optic..10.1124H | hdl=20.500.11820/4d60bb02-3c2c-4f86-a737-f985cb8613d8 | hdl-access=free }}</ref>
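A minimal sketch of how a single-photon counting lidar turns sparse photon detections into a range estimate: arrival times are histogrammed and the most-populated timing bin is converted to distance by two-way time of flight. The 1 ns bin width and simple peak-bin estimator below are assumptions chosen for illustration:

```python
from collections import Counter

C = 299_792_458.0   # speed of light, m/s
BIN_WIDTH_S = 1e-9  # assumed 1 ns timing resolution

def range_from_photon_arrivals(arrival_times_s):
    """Histogram photon arrival times (seconds after the laser pulse)
    and convert the peak bin to a distance in metres."""
    bins = Counter(round(t / BIN_WIDTH_S) for t in arrival_times_s)
    peak_bin = max(bins, key=bins.get)
    return C * (peak_bin * BIN_WIDTH_S) / 2.0

# Five signal photons near 667 ns plus two stray background counts
# still recover a target at roughly 100 m.
arrivals = [667e-9, 667e-9, 667e-9, 667e-9, 667e-9, 120e-9, 430e-9]
print(range_from_photon_arrivals(arrivals))
```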