{{short description|Method of spatial measurement using laser}} {{other uses}} <!--Please see talk page before changing the spelling of lidar to LIDAR--> '''Lidar''' ({{IPAc-en|ˈ|l|aɪ|d|ɑːr}}, also '''LIDAR''', an acronym of "light detection and ranging"<ref name="NOAA">{{cite web |last1=National Oceanic and Atmospheric Administration |date=26 February 2021 |title=What is LIDAR |url=https://oceanservice.noaa.gov/facts/lidar.html|access-date=15 March 2021 |website=oceanservice.noaa.gov |publisher=US Department of Commerce }}</ref> or "laser imaging, detection, and ranging"<ref>[https://books.google.com/books?id=nWytDwAAQBAJ&dq=laser%20imaging%2C%20detection%2C%20and%20ranging&pg=PA239 Travis S. Taylor (2019). ''Introduction to Laser Science and Engineering''. CRC Press]</ref>) is a method for determining [[ranging|ranges]] by targeting an object or a surface with a [[laser]] and measuring the time for the reflected light to return to the receiver. Lidar may operate in a fixed direction (e.g., vertical) or it may scan multiple directions, in a special combination of [[3-D scanning]] and [[laser scanning]].<ref>[https://books.google.com/books?id=N_ErDwAAQBAJ&dq=laser+imaging%2C+detection%2C+and+ranging&pg=SA9-PA39 Jie Shan and Charles K. Toth (2018). ''Topographic Laser Ranging and Scanning: Principles and Processing'' (2nd ed.). CRC Press]</ref> [[File:Effigy mounds lidar.jpg|thumb|Lidar-derived image of Marching Bears Mound Group, [[Effigy Mounds National Monument]], [[United States]]]] [[File:Starfire Optical Range - sodium laser.jpg|thumb|upright|A [[frequency addition source of optical radiation]] (FASOR) used at the [[Starfire Optical Range]] for lidar and [[laser guide star]] experiments is tuned to the [[Fraunhofer lines|sodium D2a line]] and used to excite [[sodium]] atoms [[sodium layer|in the upper atmosphere]].]] [[File:Lidar P1270901.jpg|thumb|upright|This lidar may be used to scan buildings, rock formations, et cetera, to produce a 3D model. 
The lidar can aim its laser beam in a wide range: its head rotates horizontally; a mirror tilts vertically. The laser beam is used to measure the distance to the first object on its path.]] {{multiple image | width = 200 | footer = This visualisation shows an airplane collecting a 50 km swathe of lidar data over the Brazilian rainforest. For ground-level features, colours range from deep brown to tan. Vegetation heights are depicted in shades of green, where dark greens are closest to the ground and light greens are the highest. | image1 = 50 Kilometers of Brazilian Forest Canopy.webm | caption1 = An airplane collecting treetop data over a Brazilian rainforest | image2 = Flying Through LIDAR Canopy Data.webm | caption2 = In this view, the viewer flies down to the rainforest canopy and flies through the virtual leaves. }} Lidar has terrestrial, airborne, and mobile applications.<ref>{{Cite web |date=2021-06-29 |title=Adoption of gallium-based lidar sensors gathers pace |url=https://www.argusmedia.com/en/news/2229445-adoption-of-galliumbased-lidar-sensors-gathers-pace |access-date=2021-07-14 |website=www.argusmedia.com }}</ref><ref>{{Cite web|title=Ecologists compare accuracy of Lidar technologies for monitoring forest vegetation: Findings suggest mobile platforms have great potential for monitoring a variety of forest attributes|url=https://www.sciencedaily.com/releases/2021/07/210708185947.htm|access-date=2021-07-14|website=ScienceDaily}}</ref> It is commonly used to make high-resolution maps, with applications in [[surveying]], [[geodesy]], [[geomatics]], [[archaeology]], [[geography]], [[geology]], [[geomorphology]], [[seismology]], [[forestry]], [[atmospheric physics]],<ref name="cracknell">{{Cite book| last1 = Cracknell| first1 = Arthur P. 
| last2 = Hayes| first2 = Ladson| title = Introduction to Remote Sensing| place=London| publisher = Taylor and Francis| year = 2007| orig-date = 1991| edition = 2| oclc = 70765252| isbn = 978-0-8493-9255-9}}</ref> [[laser guidance]], airborne laser swathe mapping (ALSM), and [[Mars Orbiter Laser Altimeter|laser altimetry]]. It is used to make digital [[3D modeling|3-D representations]] of areas on the Earth's surface and of the ocean bottom in the intertidal and near-coastal zone by varying the wavelength of light. It has also been increasingly used in control and navigation for [[autonomous car]]s<ref name=":13">{{Cite journal|last1=Lim|first1=Hazel Si Min|last2=Taeihagh|first2=Araz|date=2019|title=Algorithmic Decision-Making in AVs: Understanding Ethical and Technical Concerns for Smart Cities|journal=Sustainability |volume=11 |issue=20 |page=5791|doi=10.3390/su11205791|arxiv=1910.13122|doi-access=free|bibcode=2019Sust...11.5791L }}</ref> and for the [[Ingenuity (helicopter)|helicopter ''Ingenuity'']] on its record-setting flights over the terrain of [[Mars]].<ref name="ieee-2021">{{cite news|url=https://spectrum.ieee.org/nasa-designed-perseverance-helicopter-rover-fly-autonomously-mars|title=How NASA Designed a Helicopter That Could Fly Autonomously on Mars|date=17 February 2021|work=IEEE Spectrum|access-date=19 February 2021 |archive-date=19 February 2021|archive-url=https://web.archive.org/web/20210219054558/https://spectrum.ieee.org/automaton/aerospace/robotic-exploration/nasa-designed-perseverance-helicopter-rover-fly-autonomously-mars|url-status=live}}</ref> Lidar has since been used extensively for atmospheric research and [[meteorology]]. Lidar instruments fitted to [[aircraft]] and [[satellite]]s carry out [[surveying]] and mapping{{snd}} a recent example being the U.S. Geological Survey Experimental Advanced Airborne Research Lidar.<ref>"Experimental Advanced Airborne Research Lidar", ''USGS.gov''.
Retrieved 8 August 2007.</ref> [[NASA]] has identified lidar as a key technology for enabling autonomous precision safe landing of future robotic and crewed lunar-landing vehicles.<ref name="auto">{{cite book |author1=Amzajerdian, Farzin |author2=Pierrottet, Diego F. |author3=Petway, Larry B. |author4=Hines, Glenn D. |author5=Roback, Vincent E. |title=International Symposium on Photoelectronic Detection and Imaging 2011: Laser Sensing and Imaging; and Biological and Medical Applications of Photonics Sensing and Imaging |chapter=Lidar systems for precision navigation and safe landing on planetary bodies |chapter-url=https://ntrs.nasa.gov/search.jsp?R=20110012163 |volume=8192 |page=819202 |access-date=May 24, 2011 |date=2011-05-24 |bibcode=2011SPIE.8192E..02A |doi=10.1117/12.904062 |hdl=2060/20110012163 |s2cid=28483836 |hdl-access=free}}</ref> The evolution of [[quantum technology]] has given rise to the emergence of Quantum Lidar, demonstrating higher efficiency and sensitivity when compared to conventional lidar systems.<ref name="Gallego Torromé Barzanjeh 2023 p. 100497">{{cite journal | last1=Gallego Torromé | first1=Ricardo | last2=Barzanjeh | first2=Shabir | title=Advances in quantum radar and quantum LiDAR | journal=Progress in Quantum Electronics | date=2023 | volume=93 | doi=10.1016/j.pquantelec.2023.100497 | page=100497| arxiv=2310.07198 }}</ref> == History and etymology == The essential concept of lidar was originated by [[Edward Hutchinson Synge|E. H. Synge]] in 1930, who envisaged the use of powerful searchlights to probe the atmosphere.<ref>''Philosophical Magazine and Journal of Science'', 1930, series 7, volume 9, issue 60, pp. 1014–1020.</ref><ref>Donegan, J. F.; ''[https://www.livingedition.at/en/isbn/9783901585173/ The Life and Works of Edward Hutchinson Synge]'', pp. 31, 67, (co-edited with D. Weaire and [[Petros Serghiou Florides|P. 
Florides]]), Pöllauberg, Austria: Living Edition, {{ISBN|3-901585-17-6}}.</ref> Under the direction of Malcolm Stitch, the [[Hughes Aircraft Company]] introduced the first lidar-like system in 1961,<ref>{{cite news |title=New Radar System |work=Odessa American |date=28 Feb 1961}}</ref><ref name="Macomber">{{cite news |last1=Macomber |first1=Frank |title=Space Experts Seek Harness for Powerful LASER Light |url=https://www.newspaperarchive.com/us/california/bakersfield/bakersfield-californian/1963/06-03/page-5/ |access-date=11 July 2019 |work=Bakersfield Californian |agency=Copley News Service |issue=5 |date=June 3, 1963}}</ref> shortly after the invention of the laser. Intended for satellite tracking, this system combined laser-focused imaging with the ability to calculate distances by measuring the time for a signal to return, using appropriate sensors and data acquisition electronics. It was originally called "Colidar", an acronym for "coherent light detecting and ranging",<ref name="stitch">{{cite journal |first1=M. L. |last1=Stitch |first2=E. J. |last2=Woodburry |first3=J H. |last3=Morse |title=Optical ranging system uses laser transmitter |journal=Electronics |date=21 April 1961 |volume=34 |pages=51–53}}</ref> derived from the term "[[radar]]", itself an acronym for "radio detection and ranging". All{{citation needed|date=November 2023}} laser [[Rangefinding telemeter|rangefinder]]s, laser altimeters, and lidar units are derived from the early colidar systems.
The first practical terrestrial application of a colidar system was the "Colidar Mark II", a large rifle-like laser rangefinder produced in 1963, which had a range of 11 km and an accuracy of 4.5 m, to be used for military targeting.<ref>{{cite news |title=Laser Measures Distance |work=Lincoln Journal Star |issue=6 |date=29 March 1963}}</ref><ref name="Macomber" /> The first mention of lidar as a stand-alone word in 1963 suggests that it originated as a portmanteau of "[[light]]" and "radar": "Eventually the laser may provide an extremely sensitive detector of particular wavelengths from distant objects. Meanwhile, it is being used to study the Moon by 'lidar' (light radar) ..."<ref name="James Ring p. 672-3">James Ring, "The Laser in Astronomy", pp. 672–673, ''New Scientist'', June 20, 1963.</ref><ref name=Oxford>{{cite book |title=Oxford English Dictionary |year=2013 |page=Entry for "lidar" |url=http://www.oed.com/}}</ref> The name "[[photonic radar]]" is sometimes used to mean visible-spectrum range finding like lidar.<ref name="auto2">{{Cite web |url=https://www.technion.ac.il/en/2016/05/19138/ |title=Photonic Radar |website=Technion – Israel Institute of Technology |date=27 May 2016 |access-date=2018-08-12}}</ref><ref name="auto1">{{Cite web |url=http://fullafterburner.weebly.com/next-gen-weapons/radio-optic-phased-array-radar-a-comprehensive-study |title=Radio Optic Phased Array Radar – a comprehensive study |website=Full Afterburner |access-date=2018-08-12}}</ref> Lidar's first applications were in meteorology, for which the [[National Center for Atmospheric Research]] used it to measure [[clouds]] and pollution.<ref name=Goyer>{{cite journal |last=Goyer |first=G. G. |author2=R. 
Watson |title=The Laser and its Application to Meteorology |journal=Bulletin of the American Meteorological Society |date=September 1963 |volume=44 |issue=9 |pages=564–575 [568] |doi=10.1175/1520-0477-44.9.564 |bibcode=1963BAMS...44..564G |doi-access=free}}</ref> The general public became aware of the accuracy and usefulness of lidar systems in 1971 during the [[Apollo 15]] mission, when astronauts used a laser altimeter to map the surface of the Moon. Although the English language no longer treats "radar" as an acronym (i.e., it is uncapitalized), the word "lidar" was capitalized as "LIDAR" or "LiDAR" in some publications beginning in the 1980s. No consensus exists on capitalization. Various publications refer to lidar as "LIDAR", "LiDAR", "LIDaR", or "Lidar". The [[USGS]] uses both "LIDAR" and "lidar", sometimes in the same document;<ref>{{cite web |url=http://lidar.cr.usgs.gov/ |title=CLICK |website=Lidar.cr.usgs.gov |date=2015-09-16 |access-date=2016-02-22 |archive-url=https://web.archive.org/web/20160219045753/http://lidar.cr.usgs.gov/ |archive-date=2016-02-19 }}</ref> the ''[[New York Times]]'' predominantly uses "lidar" for staff-written articles,<ref>{{cite web |url=https://query.nytimes.com/search/sitesearch/?action=click&contentCollection=Science&region=TopBar&WT.nav=searchWidget&module=SearchSubmit&pgtype=article#/lidar/since1851/document_type%3A%22article%22/ |title=NYTimes.com search |website=[[The New York Times]] |access-date=2017-04-07}}</ref> although contributing news feeds such as [[Reuters]] may use "Lidar".<ref>{{cite web |url=https://www.nytimes.com/reuters/2017/03/29/technology/29reuters-uber-tech-alphabet-lawsuit.html |title=Waymo Self-Driving Unit Sought Arbitration Over Engineer Now at Uber |website=[[The New York Times]] |date=2017-03-29 |access-date=2017-04-07}}</ref> == Theory == {{multiple image | align = right | direction = horizontal | total_width = 700 | image1 = 20200501 Time of flight.svg | caption1 = Basic time-of-flight principles applied
to laser range-finding | image2 = Amazon Canopy Comes to Life through Laser Data.webm | caption2 = Flying over the Brazilian Amazon with a lidar instrument | image3 = Collecting LIDAR data over the Ganges and Brahmaputra River Basin.ogg | caption3 = Animation of a satellite collecting digital elevation map data over the Ganges and Brahmaputra River basin using lidar }} Lidar uses [[ultraviolet]], [[Interferometric visibility|visible]], or [[near infrared]] light to image objects. It can target a wide range of materials, including non-metallic objects, rocks, rain, chemical compounds, [[aerosols]], clouds and even single [[molecule]]s.<ref name="cracknell" /> A narrow laser beam can map physical features with very high [[Optical resolution|resolutions]]; for example, an aircraft can map terrain at {{convert|30|cm|adj=on}} resolution or better.<ref>{{cite web |author1=Carter, Jamie |author2=Keil Schmid |author3=Kirk Waters |author4=Lindy Betzhold |author5=Brian Hadley |author6=Rebecca Mataosky |author7=Jennifer Halleran |title=Lidar 101: An Introduction to Lidar Technology, Data, and Applications |date=2012 |page=14 |url=https://coast.noaa.gov/data/digitalcoast/pdf/lidar-101.pdf |archive-url=https://ghostarchive.org/archive/20221009/https://coast.noaa.gov/data/digitalcoast/pdf/lidar-101.pdf |archive-date=2022-10-09 |url-status=live |website=NOAA Coastal Services Center |access-date=2017-02-11}}</ref> Wavelengths vary to suit the target: from about 10 [[micrometer (unit)|micrometers]] ([[infrared]]) to approximately 250 [[nanometer]]s ([[ultraviolet]]). Typically, light is reflected via [[backscatter]]ing, as opposed to pure reflection one might find with a mirror. 
Different types of scattering are used for different lidar applications: most commonly [[Rayleigh scattering]], [[Mie scattering]], [[Raman scattering]], and [[fluorescence]].<ref name="cracknell" /> Suitable combinations of wavelengths can allow remote mapping of atmospheric contents by identifying wavelength-dependent changes in the intensity of the returned signal.<ref>{{Cite book |url=https://books.google.com/books?id=zGlQDwAAQBAJ&pg=PA678 |title=Handbook of Optoelectronics: Concepts, Devices, and Techniques (Volume One) |last1=P. Dakin |first1=John |last2=Brown |first2=Robert |publisher=CRC Press |year=2017 |isbn=978-1-4822-4179-2|page=678}}</ref> The name "photonic radar" is sometimes used to mean visible-spectrum range finding like lidar,<ref name="auto2"/><ref name="auto1"/> although [[photonic radar]] more strictly refers to radio-frequency range finding using [[photonics]] components. A lidar determines the distance of an object or a surface with the [[formula]]:<ref>{{Cite web |last=Podest |first=Erika |date=2021-03-16 |title=The Fundamentals of LiDAR |url=https://appliedsciences.nasa.gov/sites/default/files/2021-03/SIF_LIDAR_Podest_Final.pdf |access-date=2024-09-07 |website=NASA}}</ref> :<math>d=\frac{c\cdot t}{2}</math> where ''c'' is the [[speed of light]], ''d'' is the distance between the detector and the object or surface being detected, and ''t'' is the time spent for the laser light to travel to the object or surface being detected, then travel back to the detector. The two kinds of lidar detection schemes are "incoherent" or direct energy detection (which principally measures amplitude changes of the reflected light) and [[Coherence (physics)|coherent]] detection (best for measuring [[Doppler effect|Doppler]] shifts, or changes in the phase of the reflected light). Coherent systems generally use [[optical heterodyne detection]].<ref>{{cite book|title=Laser – Surface Interactions|last=Rashid A. 
Ganeev|publisher=Springer Science & Business Media|url= https://books.google.com/books?id=H8DEBAAAQBAJ&pg=PR2|page=32|isbn=978-94-007-7341-7|year=2013}}</ref> This is more sensitive than direct detection and allows coherent systems to operate at much lower power, but requires more complex transceivers. Both detection schemes employ one of two pulse models: ''micropulse'' or ''high energy''. Micropulse systems utilize intermittent bursts of energy. They developed as a result of ever-increasing computer power, combined with advances in laser technology. They use considerably less energy in the laser, typically on the order of one [[Joule#Microjoule|microjoule]], and are often "eye-safe", meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring atmospheric parameters: the height, layering and densities of clouds, cloud particle properties ([[refractive index#Dispersion and absorption|extinction coefficient]], backscatter coefficient, [[depolarization]]), temperature, pressure, wind, humidity, and trace gas concentration (ozone, methane, [[nitrous oxide]], etc.).<ref name="cracknell" /> == Components == [[File:LIDAR-scanned-SICK-LMS-animation.gif|thumb|A basic lidar system involves a laser range finder reflected by a rotating mirror (top). The laser is scanned around the scene being digitized, in one or two dimensions (middle), gathering distance measurements at specified angle intervals (bottom).]] {{more citations needed section|date=April 2017}} === Laser === 600–1,000 [[nanometre|nm]] [[laser]]s are most common for non-scientific applications. The maximum power of the laser is limited, or an automatic shut-off system which turns the laser off at specific altitudes is used, in order to make it eye-safe for people on the ground. One common alternative, the 1,550 nm laser, is eye-safe at relatively high power levels, since this wavelength is not strongly absorbed by the eye.
A trade-off, though, is that current detector technology is less advanced, so these wavelengths are generally used at longer ranges with lower accuracies. They are also used for military applications, because 1,550 nm is not visible in [[Night vision device|night vision goggles]], unlike the shorter 1,000 nm infrared laser. Airborne topographic mapping lidars generally use 1,064 nm diode-pumped [[Yttrium aluminium garnet|YAG]] lasers, while [[bathymetric]] (underwater depth research) systems generally use 532 nm [[Second-harmonic generation|frequency-doubled]] diode-pumped YAG lasers, because 532 nm penetrates water with much less [[attenuation]] than 1,064 nm. Laser settings include the laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, [[YLF]], etc.), and [[Q-switch]] (pulsing) speed. Better target resolution is achieved with shorter pulses, provided the lidar receiver detectors and electronics have sufficient bandwidth.<ref name="cracknell" /> A [[phased array]] can illuminate any direction by using a microscopic array of individual antennas. Controlling the timing (phase) of each antenna steers a cohesive signal in a specific direction. Phased arrays have been used in radar since the 1940s. On the order of a million optical antennas are needed to produce a radiation pattern of a certain size in a certain direction; to achieve this, the phase of each individual antenna (emitter) is precisely controlled. It is very difficult, if at all possible, to use the same technique in a lidar. The main problems are that all the individual emitters must be coherent (technically, coming from the same "master" oscillator or laser source) and must have dimensions about the wavelength of the emitted light (on the order of 1 micron) so that each acts as a point source, with its phase controlled to high accuracy.
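The phase-steering principle described above can be illustrated numerically. The following sketch (all parameters are illustrative assumptions, not values from any particular lidar) computes the normalized far-field pattern of a uniform linear array of emitters and confirms that a linear phase gradient across the emitters steers the main lobe to the commanded angle:

```python
import cmath, math

def array_factor(theta, n_emitters, spacing_wl, steer_deg):
    """Normalized far-field intensity of a uniform linear array.
    spacing_wl: emitter spacing in wavelengths; steer_deg: commanded beam direction.
    Each emitter n carries a phase offset that cancels the path difference
    toward the steer angle, so contributions add coherently there."""
    k_d = 2 * math.pi * spacing_wl
    steer = math.radians(steer_deg)
    total = sum(
        cmath.exp(1j * n * k_d * (math.sin(theta) - math.sin(steer)))
        for n in range(n_emitters)
    )
    return abs(total) / n_emitters

# Sweep look angles and find where the steered main lobe lands.
angles = [math.radians(a) for a in range(-90, 91)]
best = max(angles, key=lambda t: array_factor(t, n_emitters=64,
                                              spacing_wl=0.5, steer_deg=20))
print(round(math.degrees(best)))  # main lobe lands at the commanded 20 degrees
```

With half-wavelength spacing no grating lobes appear, so the single maximum tracks the commanded angle; in a real optical phased array, holding each emitter's phase this precisely is exactly the difficulty the text describes.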
[[MEMS|Microelectromechanical mirrors (MEMS)]] are not entirely solid-state. However, their tiny form factor provides many of the same cost benefits. A single laser is directed to a single mirror that can be reoriented to view any part of the target field. The mirror spins at a rapid rate. However, MEMS systems generally operate in a single plane (left to right). To add a second dimension generally requires a second mirror that moves up and down. Alternatively, another laser can hit the same mirror from another angle. MEMS systems can be disrupted by shock/vibration and may require repeated calibration.<ref name=":11">{{Cite news |last=Mokey |first=Nick |date=2018-03-15 |title=A self-driving car in every driveway? Solid-state lidar is the key |work=Digital Trends |url=https://www.digitaltrends.com/cars/solid-state-lidar-for-self-driving-cars/ |access-date=2018-06-15}}</ref> === Scanner and optics === Image development speed is affected by the speed at which they are scanned. Options to scan the [[azimuth]] and elevation include dual oscillating plane mirrors, a combination with a polygon mirror, and a [[Laser scanning|dual axis scanner]]. Optic choices affect the angular resolution and range that can be detected. A hole mirror or a [[beam splitter]] are options to collect a return signal. === Photodetector and receiver electronics === Two main [[photodetector]] technologies are used in lidar: [[Solid-state electronics|solid-state]] photodetectors, such as silicon avalanche [[photodiode]]s, or [[photomultiplier]]s. The sensitivity of the receiver is another parameter that has to be balanced in a lidar design. === Position and navigation systems === Lidar sensors mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a [[Global Positioning System]] receiver and an [[inertial measurement unit]] (IMU). 
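How the position and navigation instrumentation combines with the range measurement can be sketched in simplified form. The sketch below is only illustrative: the function names are hypothetical, and the geometry assumes level flight with a single cross-track scan angle, whereas a real system applies the full IMU attitude (roll, pitch, yaw). It uses the Theory-section relation d = ct/2 to turn a round-trip time into a range, then places the return relative to the GPS-reported platform position:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_s):
    """Theory-section relation: d = c * t / 2."""
    return C * round_trip_s / 2.0

def georeference(platform_xyz, heading_deg, scan_angle_deg, round_trip_s):
    """Place a single lidar return in a local map frame (simplified geometry:
    level flight, beam pointing down, tilted cross-track by the scan angle)."""
    d = tof_range(round_trip_s)
    h = math.radians(heading_deg)
    s = math.radians(scan_angle_deg)
    down = d * math.cos(s)           # vertical component of the beam
    cross = d * math.sin(s)          # cross-track component
    x0, y0, z0 = platform_xyz
    # Cross-track direction is the heading rotated 90 degrees.
    return (x0 + cross * math.cos(h + math.pi / 2),
            y0 + cross * math.sin(h + math.pi / 2),
            z0 - down)

# A nadir pulse returning after 2 * 1000 m / c of travel time:
pt = georeference((0.0, 0.0, 1000.0), heading_deg=0.0, scan_angle_deg=0.0,
                  round_trip_s=2 * 1000.0 / C)
print(pt)  # nadir return from 1000 m altitude lands at ground level (z ~ 0)
```

Errors in the assumed platform position or attitude translate directly into ground-coordinate errors, which is why survey-grade systems pair the scanner with a GPS receiver and an IMU.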
=== Sensor === Lidar uses active sensors that supply their own illumination source. The energy source hits objects and the reflected energy is detected and measured by sensors. Distance to the object is determined by recording the time between transmitted and backscattered pulses and by using the speed of light to calculate the distance traveled.<ref name="NASA">{{Cite web|url=https://earthdata.nasa.gov/user-resources/remote-sensors#hyperspectral|title=Remote Sensors {{!}} Earthdata|website=earthdata.nasa.gov|access-date=2017-03-18}} {{PD-notice}}</ref> Flash lidar allows for 3-D imaging because of the camera's ability to emit a larger flash and sense the spatial relationships and dimensions of area of interest with the returned energy. This allows for more accurate imaging because the captured frames do not need to be stitched together, and the system is not sensitive to platform motion. This results in less distortion.<ref>{{Cite web|url=https://asc3d.com/our-technology/|title=Advanced Scientific Concepts Inc|website=asc3d.com|access-date=2022-07-03}}</ref> 3-D imaging can be achieved using both scanning and non-scanning systems. "3-D gated viewing laser radar" is a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera. Research has begun for virtual beam steering using [[Digital Light Processing]] (DLP) technology. Imaging lidar can also be performed using arrays of high speed detectors and modulation sensitive detector arrays typically built on single chips using [[complementary metal–oxide–semiconductor]] (CMOS) and hybrid CMOS/[[Charge-coupled device|charge-coupled device]] (CCD) fabrication techniques. In these devices each pixel performs some local processing such as demodulation or gating at high speed, downconverting the signals to video rate so that the array can be read like a camera. 
Using this technique many thousands of pixels / channels may be acquired simultaneously.<ref>{{cite patent |country=US |number=5081530 |status=patent |title=Three Dimensional Camera and Rangefinder |gdate=1992-01-14 |invent1=Medina, Antonio |assign1=Medina, Antonio}}</ref> High resolution 3-D lidar cameras use [[homodyne detection]] with an electronic CCD or CMOS [[Shutter (photography)|shutter]].<ref name="Medina A, Gayá F, and Pozo F 800–805 http://www.opticsinfobase.org/josaa/abstract.cfm?URI=josaa-23-4-800">{{Cite journal|vauthors=Medina A, Gayá F, Pozo F|year=2006|title=Compact laser radar and three-dimensional camera|journal= Journal of the Optical Society of America A|volume=23|issue=4|pages=800–805|bibcode=2006JOSAA..23..800M|doi=10.1364/josaa.23.000800|pmid=16604759}}</ref> A coherent imaging lidar uses [[synthetic array heterodyne detection]] to enable a staring single element receiver to act as though it were an imaging array.<ref name="Strauss" /> In 2014, [[MIT Lincoln Laboratory|Lincoln Laboratory]] announced a new imaging chip with more than 16,384 pixels, each able to image a single photon, enabling them to capture a wide area in a single image. An earlier generation of the technology with one fourth as many pixels was dispatched by the U.S. military after the January 2010 Haiti earthquake. A single pass by a business jet at {{cvt|3000|m|ft|-3}} over Port-au-Prince was able to capture instantaneous snapshots of {{cvt|600|m|ft|-2}} squares of the city at a resolution of {{cvt|30|cm|ft|0}}, displaying the precise height of rubble strewn in city streets.<ref>{{cite web|url=https://www.technologyreview.com/s/524166/the-worlds-most-powerful-3-d-laser-imager/|title=The World's Most Powerful 3-D Laser Imager|date=2014-02-13|website=technologyreview.com|access-date=2017-04-06}}</ref> The new system is ten times better, and could produce much larger maps more quickly. 
The chip uses [[indium gallium arsenide]] (InGaAs), which operates in the infrared spectrum at a relatively long wavelength that allows for higher power and longer ranges. In many applications, such as self-driving cars, the new system will lower costs by not requiring a mechanical component to aim the chip. InGaAs uses less hazardous wavelengths than conventional silicon detectors, which operate at visual wavelengths.<ref>{{cite magazine|last=Talbot|first=David|date=2014-02-13|title=New Optical Chip Will Sharpen Military and Archeological Aerial Imaging|url=http://www.technologyreview.com/news/524166/the-worlds-most-powerful-3-d-laser-imager|access-date=2014-02-17|magazine=[[MIT Technology Review]]}}</ref> New technologies for infrared [[Photon counting|single-photon counting]] LIDAR are advancing rapidly, including arrays and cameras in a variety of [[Single-photon avalanche diode|semiconductor]] and [[Superconducting nanowire single-photon detector|superconducting]] platforms.<ref>{{Cite journal |url=https://opg.optica.org/optica/viewmedia.cfm?uri=optica-10-9-1124&html=true |access-date=2023-08-29 |journal=Optica |doi=10.1364/optica.488853 | title=Single-photon detection for long-range imaging and sensing | date=2023 | last1=Hadfield | first1=Robert H. | last2=Leach | first2=Jonathan | last3=Fleming | first3=Fiona | last4=Paul | first4=Douglas J. | last5=Tan | first5=Chee Hing | last6=Ng | first6=Jo Shien | last7=Henderson | first7=Robert K. | last8=Buller | first8=Gerald S. | volume=10 | issue=9 | page=1124 | doi-access=free |bibcode=2023Optic..10.1124H | hdl=20.500.11820/4d60bb02-3c2c-4f86-a737-f985cb8613d8 | hdl-access=free }}</ref> ==Classification== Lidar can be oriented to [[nadir]], [[zenith]], or laterally. For example, lidar altimeters look down, an atmospheric lidar looks up, and lidar-based [[collision avoidance system]]s are side-looking. 
Laser projections of lidars can be manipulated using various methods and mechanisms to produce a scanning effect: the standard spindle-type, which spins to give a 360-degree view; solid-state lidar, which has a fixed field of view, but no moving parts, and can use either MEMS or optical phased arrays to steer the beams; and flash lidar, which spreads a flash of light over a large field of view before the signal bounces back to a detector.<ref name=":14">{{Cite web|title=The Wild West of Automotive Lidar|url=https://spie.org/news/photonics-focus/marapr-2020/wild-west-of-automotive-lidar|access-date=2020-12-26|website=spie.org}}</ref> Lidar applications can be divided into airborne and terrestrial types.<ref name=":0">{{Cite book|title = Airborne and terrestrial laser scanning|last1 = Vosselman|first1 = George|publisher = Whittles Publishing|year = 2012|isbn = 978-1-904445-87-6|last2 = Maas|first2 = Hans-Gerd}}</ref> The two types require scanners with varying specifications based on the data's purpose, the size of the area to be captured, the range of measurement desired, the cost of equipment, and more. Spaceborne platforms are also possible, see [[satellite laser altimetry]]. Airborne lidar (also ''airborne laser scanning'') is when a laser scanner, while attached to an aircraft during flight, creates a [[Point cloud|3-D point cloud]] model of the landscape. This is currently the most detailed and accurate method of creating [[digital elevation model]]s, replacing [[photogrammetry]]. One major advantage in comparison with photogrammetry is the ability to filter out reflections from vegetation from the point cloud model to create a [[Digital Terrain Model|digital terrain model]] which represents ground surfaces such as rivers, paths, cultural heritage sites, etc., which are concealed by trees. 
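The vegetation-filtering step mentioned above can be illustrated with a deliberately naive toy: keep only the lowest return in each grid cell as a ground candidate. Production DTM filters (progressive TIN densification, morphological filters) are far more robust; this sketch, with assumed coordinates and cell size, only shows the idea of separating canopy returns from ground returns:

```python
def grid_min_ground(points, cell=5.0):
    """Naive ground filter: keep the lowest return in each square grid cell.
    points: iterable of (x, y, z) lidar returns; cell: cell size in metres."""
    lowest = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        # A canopy return and a ground return in the same cell compete on z.
        if key not in lowest or z < lowest[key][2]:
            lowest[key] = (x, y, z)
    return sorted(lowest.values())

# One cell holds a canopy return (z = 18.0 m) and a ground return (z = 1.2 m);
# a second cell holds only a ground return.
returns = [(2.0, 3.0, 18.0), (2.5, 3.5, 1.2), (7.0, 3.0, 0.9)]
print(grid_min_ground(returns))  # [(2.5, 3.5, 1.2), (7.0, 3.0, 0.9)]
```

The surviving points approximate the bare ground; interpolating them yields the digital terrain model, while interpolating the highest returns instead would yield the digital surface model.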
Within the category of airborne lidar, a distinction is sometimes made between high-altitude and low-altitude applications, but the main difference is a reduction in both the accuracy and the point density of data acquired at higher altitudes. Airborne lidar can also be used to create bathymetric models in shallow water.<ref>{{Cite journal|title = Airborne laser bathymetry for documentation of submerged archaeological sites in shallow water|journal = ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences|pages = 99–107|volume = XL-5/W5|doi = 10.5194/isprsarchives-xl-5-w5-99-2015|first1 = M.|last1 = Doneus|first2 = I.|last2 = Miholjek|first3 = G.|last3 = Mandlburger|first4 = N.|last4 = Doneus|first5 = G.|last5 = Verhoeven|first6 = Ch.|last6 = Briese|first7 = M.|last7 = Pregesbauer|bibcode = 2015ISPArXL55...99D |year = 2015|doi-access = free|hdl = 1854/LU-5933247|hdl-access = free}}</ref> The main products of airborne lidar are [[digital elevation model]]s (DEM) and digital surface models (DSM). The raw points and classified ground points are vectors of discrete points, while the DEM and DSM are raster grids interpolated from those discrete points. The process also involves capturing digital aerial photographs. Airborne lidar is used, for example, to interpret deep-seated landslides hidden under vegetation cover by revealing features such as scarps, tension cracks, and tipped trees.
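The interpolation from discrete ground points to a raster DEM can be sketched with inverse-distance weighting, one simple scheme among many used in practice (the function name, grid layout, and sample points below are illustrative assumptions, not any product's actual algorithm):

```python
def idw_raster(points, x0, y0, ncols, nrows, cell, power=2.0):
    """Interpolate scattered (x, y, z) ground points onto a raster grid
    using inverse-distance weighting: nearer points get larger weights."""
    grid = []
    for r in range(nrows):
        row = []
        for c in range(ncols):
            # Elevation is evaluated at each cell centre.
            cx, cy = x0 + (c + 0.5) * cell, y0 + (r + 0.5) * cell
            num = den = 0.0
            for x, y, z in points:
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                if d2 == 0.0:
                    num, den = z, 1.0   # cell centre coincides with a point
                    break
                w = 1.0 / d2 ** (power / 2.0)
                num += w * z
                den += w
            row.append(num / den)
        grid.append(row)
    return grid

# Two ground points, 10 m and 20 m elevation, gridded at 5 m cells:
dem = idw_raster([(0.0, 0.0, 10.0), (10.0, 10.0, 20.0)],
                 x0=0, y0=0, ncols=2, nrows=2, cell=5.0)
# Cells nearer the low point take values nearer 10 m; nearer the high point, nearer 20 m.
```

Real DEM production typically uses triangulation or kriging rather than brute-force IDW, but the principle of filling a regular grid from irregular returns is the same.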
Airborne lidar digital elevation models can see through the forest canopy, permitting detailed measurements of scarps, erosion, and the tilting of electric poles.<ref>{{Cite journal|last1=Chiu|first1=Cheng-Lung|last2=Fei|first2=Li-Yuan|last3=Liu|first3=Jin-King|last4=Wu|first4=Ming-Chee|title=National Airborne Lidar Mapping and Examples for applications in deep-seated landslides in Taiwan|journal=Geoscience and Remote Sensing Symposium (IGARSS), 2015 IEEE International|issn= 2153-7003}}</ref> Airborne lidar data can be processed with the Toolbox for Lidar Data Filtering and Forest Studies (TIFFS),<ref name=":10" /> which filters the lidar data and interpolates it into digital terrain models. The laser is directed at the region to be mapped, and each point's height above the ground is calculated by subtracting the corresponding digital terrain model elevation from the point's original z-coordinate. Based on this height above the ground, the non-vegetation returns are separated out; these may include objects such as buildings, electric power lines, flying birds, and insects. The remaining points are treated as vegetation and used for modeling and mapping. Within each sample plot, lidar metrics are derived by calculating statistics such as the mean, standard deviation, skewness, percentiles, and quadratic mean.<ref name=":10">{{Cite journal|last1=Yuan|first1=Zeng|last2=Yujin|first2=Zhao|last3=Dan|first3=Zhao|last4=Bingfang|first4=Wu|title=Forest Biodiversity mapping using airborne and hyper-spectral data|journal=Geoscience and Remote Sensing Symposium (IGARSS), 2016 IEEE International |issn=2153-7003}}</ref> [[File:Yellowscan LIDAR on OnyxStar FOX-C8 HD.jpg|thumb|Lidar scanning performed with a multicopter [[Unmanned aerial vehicle|UAV]]]] Multiple commercial lidar systems for [[unmanned aerial vehicle]]s are currently on the market.
These platforms can systematically scan large areas, or provide a cheaper alternative to manned aircraft for smaller scanning operations.<ref>{{Cite journal|title = Drone remote sensing for forestry research and practices|journal = Journal of Forestry Research|date = 2015-06-21|issn = 1007-662X|pages = 791–797|volume = 26|issue = 4|doi = 10.1007/s11676-015-0088-y |first1 = Lina|last1 = Tang|first2 = Guofan|last2 = Shao| bibcode=2015JFoR...26..791T |s2cid = 15695164}}</ref> [[File:Airborne Lidar Bathymetric Technology.jpg|thumb|Airborne lidar bathymetry: high-resolution multibeam lidar map showing spectacularly faulted and deformed seafloor geology, in shaded relief and coloured by depth]] Airborne bathymetric lidar systems measure the [[time of flight]] of a laser pulse from the source to its return at the sensor. The data acquisition technique involves a sea floor mapping component and a ground truth component that includes video transects and sampling. It works using a green spectrum (532 nm) laser beam.<ref name="Nayegandhi">{{Cite web|url=https://www.ngs.noaa.gov/corbin/class_description/Nayegandhi_green_lidar.pdf |archive-url=https://ghostarchive.org/archive/20221009/https://www.ngs.noaa.gov/corbin/class_description/Nayegandhi_green_lidar.pdf |archive-date=2022-10-09 |url-status=live|title=Nayegandhi Green Lidar}}</ref> Two beams are projected onto a fast rotating mirror, which creates an array of points. One of the beams penetrates the water and, under favorable conditions, also detects the bottom surface. Water depth measurable by lidar depends on the clarity of the water and the absorption of the wavelength used. Water is most transparent to green and blue light, so these will penetrate deepest in clean water.<ref name="Bathymetric"/> Blue-green light of 532 nm produced by [[frequency doubled]] solid-state IR laser output is the standard for airborne bathymetry.
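The time-of-flight principle behind bathymetric lidar can be illustrated with a short sketch. The values are assumed for illustration: seawater's refractive index is taken as roughly 1.33, which slows light below the surface, and the factor of two accounts for the round trip.

```python
C = 299_792_458.0   # speed of light in vacuum, m/s (approximately that in air)
N_WATER = 1.33      # approximate refractive index of seawater (assumed)

def water_depth(t_surface_s, t_bottom_s):
    """Depth from the delay between the surface return and the bottom return.

    Light travels at c / n in water, and the pulse traverses the water
    column twice, hence the factor of 2 in the denominator.
    """
    dt = t_bottom_s - t_surface_s
    return C * dt / (2 * N_WATER)

# A bottom return arriving 20 ns after the surface return:
print(round(water_depth(0.0, 20e-9), 2))  # 2.25 (metres)
```

Operational systems additionally correct for beam geometry, refraction at the air-water interface, and wave action, none of which this sketch models.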
This light can penetrate water but pulse strength attenuates exponentially with distance traveled through the water.<ref name="Nayegandhi" /> Lidar can measure depths from about {{cvt|0.9|to|40|m|ft|0}}, with vertical accuracy in the order of {{cvt|15|cm|in|0}}. The surface reflection makes water shallower than about {{cvt|0.9|m|ft|0}} difficult to resolve, and absorption limits the maximum depth. Turbidity causes scattering and has a significant role in determining the maximum depth that can be resolved in most situations, and dissolved pigments can increase absorption depending on wavelength.<ref name="Bathymetric">{{cite web |url=http://home.iitk.ac.in/~blohani/LiDAR_Tutorial/Bathymetric%20LiDAR.htm |title=1.2.2 Bathymetric LiDAR |website=home.iitk.ac.in |access-date=15 January 2023 }}</ref> Other reports indicate that water penetration tends to be between two and three times Secchi depth. Bathymetric lidar is most useful in the {{cvt|0|-|10|m|ft|0}} depth range in coastal mapping.<ref name="Nayegandhi" /> On average, in fairly clear coastal seawater lidar can penetrate to about {{cvt|7|m|ft|0}}, and in turbid water to about {{cvt|3|m|ft|0}}. Saputra et al. (2021) found that green laser light penetrates water to about one and a half to two times the Secchi depth in Indonesian waters.
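The exponential attenuation noted above follows the Beer-Lambert law. A small sketch shows how quickly the usable signal fades with depth; the diffuse attenuation coefficient used here is an assumed, illustrative value, not a measured constant.

```python
import math

def returned_fraction(depth_m, k_per_m):
    """Fraction of pulse energy surviving the two-way path through water
    (Beer-Lambert law; the factor of 2 accounts for the round trip)."""
    return math.exp(-2 * k_per_m * depth_m)

# Assumed diffuse attenuation coefficient for clear coastal water at 532 nm
k = 0.15  # 1/m (illustrative value only)
for d in (1, 5, 10, 20):
    print(f"{d:>2} m: {returned_fraction(d, k):.4f}")
```

With this illustrative coefficient only about 5% of the pulse energy survives a 10 m round trip, which is consistent with the depth limits quoted above for fairly clear coastal water.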
Water temperature and salinity affect the refractive index of water, which has a small effect on the depth calculation.<ref name="Saputra et al 2021" >{{cite journal |last1=Saputra |first1=Romi |last2=Radjawane |first2=Ivonne |last3=Park |first3=H |last4=Gularso |first4=Herjuno |date=2021 |title=Effect of Turbidity, Temperature and Salinity of Waters on Depth Data from Airborne LiDAR Bathymetry |journal=IOP Conference Series: Earth and Environmental Science |volume=925 |issue=1 |page=012056 |doi=10.1088/1755-1315/925/1/012056 |bibcode=2021E&ES..925a2056S |s2cid=244918525 |doi-access=free }}</ref> The data obtained shows the full extent of the land surface exposed above the sea floor, making the technique useful for large-scale sea floor mapping programs. The mapping yields onshore topography as well as underwater elevations. Sea floor reflectance imaging is another product of this system, which can benefit mapping of underwater habitats. The technique has been used for three-dimensional image mapping of California's waters using a hydrographic lidar.<ref name="Wilson 2008">{{Cite book |isbn=978-1-4244-2126-8 |doi=10.1109/OCEANSKOBE.2008.4530980|chapter=Using Airborne Hydrographic LiDAR to Support Mapping of California's Waters|title=OCEANS 2008 - MTS/IEEE Kobe Techno-Ocean|pages=1–8|year=2008|last1=Wilson|first1=Jerry C.|s2cid=28911362}}</ref> Airborne lidar systems were traditionally able to acquire only a few peak returns, while more recent systems acquire and digitize the entire reflected signal.<ref name=":15" /> Researchers have analysed the waveform signal to extract peak returns using [[Gaussian decomposition]].<ref>{{Cite journal|last1=Wagner|first1=Wolfgang|last2=Ullrich|first2=Andreas|last3=Ducic|first3=Vesna|last4=Melzer|first4=Thomas|last5=Studnicka|first5=Nick|date=2006-04-01|title=Gaussian decomposition and calibration of a novel small-footprint full-waveform digitising airborne laser
scanner|url=https://www.sciencedirect.com/science/article/pii/S0924271605001024|journal=ISPRS Journal of Photogrammetry and Remote Sensing|volume=60|issue=2|pages=100–112|doi=10.1016/j.isprsjprs.2005.12.001|bibcode=2006JPRS...60..100W|issn=0924-2716}}</ref> Zhuang et al. (2015) used this approach for estimating aboveground biomass.<ref>{{Cite journal|last1=Zhuang|first1=Wei|last2=Mountrakis|first2=Giorgos|last3=Wiley|first3=John J. Jr.|last4=Beier|first4=Colin M.|date=2015-04-03|title=Estimation of above-ground forest biomass using metrics based on Gaussian decomposition of waveform lidar data|journal=International Journal of Remote Sensing|volume=36|issue=7|pages=1871–1889|doi=10.1080/01431161.2015.1029095|bibcode=2015IJRS...36.1871Z|s2cid=55987035|issn=0143-1161}}</ref> Handling the huge volumes of full-waveform data is difficult, so Gaussian decomposition of the waveforms is effective: it reduces the data and is supported by existing workflows for interpreting 3-D [[point cloud]]s. Recent studies have investigated [[voxel]]isation, in which the intensities of the waveform samples are inserted into a voxelised space (a 3-D grayscale image), building up a 3-D representation of the scanned area.<ref name=":15">{{Cite book|last1=Miltiadou|first1=M.|last2=Grant|first2=Michael G.|last3=Campbell|first3=N. D.
F.|last4=Warren|first4=M.|last5=Clewley|first5=D.|last6=Hadjimitsis|first6=Diofantos G.|title=Seventh International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2019) |chapter=Open source software DASOS: Efficient accumulation, analysis, and visualisation of full-waveform lidar |editor1-first=Giorgos|editor1-last=Papadavid|editor2-first=Kyriacos|editor2-last=Themistocleous|editor3-first=Silas|editor3-last=Michaelides|editor4-first=Vincent|editor4-last=Ambrosia|editor5-first=Diofantos G|editor5-last=Hadjimitsis|date=2019-06-27|chapter-url=https://www.spiedigitallibrary.org/conference-proceedings-of-spie/11174/111741M/Open-source-software-DASOS--efficient-accumulation-analysis-and-visualisation/10.1117/12.2537915.short|publisher=International Society for Optics and Photonics|volume=11174|pages=111741M|doi=10.1117/12.2537915|bibcode=2019SPIE11174E..1MM|isbn=978-1-5106-3061-1|s2cid=197660590}}</ref> Related metrics and information can then be extracted from that voxelised space. Structural information can be extracted using 3-D metrics from local areas and there is a case study that used the voxelisation approach for detecting dead standing [[Eucalypt]] trees in Australia.<ref>{{Cite journal|date=2018-05-01|title=Detection of dead standing Eucalyptus camaldulensis without tree delineation for managing biodiversity in native Australian forest|journal=International Journal of Applied Earth Observation and Geoinformation|volume=67|pages=135–147|doi=10.1016/j.jag.2018.01.008|issn=0303-2434|doi-access=free|last1=Miltiadou|first1=Milto|last2=Campbell|first2=Neil D.F.|last3=Gonzalez Aracil|first3=Susana|last4=Brown|first4=Tony|last5=Grant|first5=Michael G.|bibcode=2018IJAEO..67..135M|hdl=20.500.14279/19541|hdl-access=free}}</ref> Terrestrial applications of lidar (also ''terrestrial laser scanning'') happen on the Earth's surface and can be either stationary or mobile. 
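The waveform voxelisation described earlier can be sketched as follows. This is a simplified, illustrative accumulation into a sparse grid; real tools such as DASOS handle full-waveform file formats, georeferencing, and metric extraction.

```python
def voxelise(samples, voxel_size):
    """Accumulate waveform intensity samples into a sparse 3-D voxel grid.

    samples: iterable of (x, y, z, intensity) waveform samples.
    Returns a dict mapping integer voxel indices to summed intensity,
    i.e. a 3-D greyscale representation of the scanned area.
    """
    grid = {}
    for x, y, z, intensity in samples:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        grid[key] = grid.get(key, 0.0) + intensity
    return grid

samples = [(0.2, 0.3, 5.1, 10.0), (0.4, 0.1, 5.4, 7.0), (3.0, 0.2, 1.0, 2.0)]
grid = voxelise(samples, voxel_size=1.0)
print(grid)  # {(0, 0, 5): 17.0, (3, 0, 1): 2.0}
```

Metrics such as vertical intensity profiles or gap fractions can then be read out of the grid column by column.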
Stationary terrestrial scanning is most common as a survey method, for example in conventional topography, monitoring, cultural heritage documentation and forensics.<ref name=":0" /> The [[Point cloud|3-D point clouds]] acquired from these types of scanners can be matched with [[digital image]]s taken of the scanned area from the scanner's location to create realistic-looking 3-D models in a relatively short time when compared to other technologies. Each point in the [[point cloud]] is given the colour of the pixel from the image taken at the same location and direction as the laser beam that created the point. Terrestrial lidar mapping involves [[Occupancy grid mapping|occupancy grid map generation]]. The mapped area is divided into a grid of cells, and each cell stores the height values of the lidar returns that fall within it. A binary map is then created by applying a particular threshold to the cell values for further processing. The next step is to process the radial distance and z-coordinates from each scan to identify which 3-D points correspond to each grid cell.<ref name=":7">{{cite book | chapter-url=https://doi.org/10.1109/ICIP.2010.5651197 | doi=10.1109/ICIP.2010.5651197 | chapter=A real-time grid map generation and object classification for ground-based 3D LIDAR data using image analysis techniques | title=2010 IEEE International Conference on Image Processing | date=2010 | last1=Lee | first1=Sang-Mook | last2=Im | first2=Jeong Joon | last3=Lee | first3=Bo-Hee | last4=Leonessa | first4=Alexander | last5=Kurdila | first5=Andrew | pages=2253–2256 | isbn=978-1-4244-7992-4 }}</ref> Mobile lidar (also ''mobile laser scanning'') is when two or more scanners are attached to a moving vehicle to collect data along a path.
These scanners are almost always paired with other kinds of equipment, including [[Satellite navigation|GNSS]] receivers and [[Inertial measurement unit|IMUs]]. One example application is surveying streets, where power lines, exact bridge heights, bordering trees, etc. all need to be taken into account. Instead of collecting each of these measurements individually in the field with a [[Tachymeter (survey)|tachymeter]], a 3-D model can be created from a point cloud, in which all of the needed measurements can be made, depending on the quality of the data collected. This eliminates the problem of forgetting to take a measurement, so long as the model is available, reliable and has an appropriate level of accuracy. ==Applications== [[File:LIDAR equipped mobile robot.jpg|thumb|This [[mobile robot]] uses its lidar to construct a map and avoid obstacles.]] Lidar has a wide variety of applications beyond those listed below; it is often mentioned in [[National lidar dataset|national lidar dataset]] programs. These applications are largely determined by the range of effective object detection; resolution, which is how accurately the lidar identifies and classifies objects; and reflectance confusion, meaning how well the lidar can see something in the presence of bright objects, like reflective signs or bright sun.<ref name=":14" /> Companies are working to cut the cost of lidar sensors, currently anywhere from about US$1,200 to more than $12,000.
Lower prices will make lidar more attractive for new markets.<ref>{{Cite news|url=https://www.reuters.com/article/us-tech-ces-lidar/lidar-laser-sensing-technology-from-self-driving-cars-to-dance-contests-idUSKBN1Z62AS|title=Lidar laser-sensing technology: From self-driving cars to dance contests|newspaper=Reuters|date=7 January 2020}}</ref> ===Agriculture=== [[File:LIDAR field yield.jpg|thumb|alt=Graphic of a lidar return, featuring different crop yield rates|Lidar is used to analyze yield rates on agricultural fields.]] [[Agricultural robot]]s have been used for a variety of purposes, ranging from seed and fertilizer dispersion and sensing techniques to crop scouting for weed control. Lidar can help determine where to apply costly fertilizer. It can create a topographical map of the fields and reveal slopes and sun exposure of the farmland. Researchers at the [[Agricultural Research Service]] used this topographical data with the farmland yield results from previous years to categorize land into zones of high, medium, or low yield.<ref>{{cite web |url=http://www.ars.usda.gov/is/pr/2010/100609.htm |title=ARS Study Helps Farmers Make Best Use of Fertilizers |publisher=USDA Agricultural Research Service |date=June 9, 2010}}</ref> This indicates where to apply fertilizer to maximize yield. Lidar is now used to monitor insects in the field.
Lidar can detect the movement and behavior of individual flying insects, with identification down to sex and species.<ref>{{Cite book|first1=Alem|last1=Gebru|first2=Samuel|last2=Jansson|first3=Rickard|last3=Ignell|first4=Carsten|last4=Kirkeby|first5=Mikkel|last5=Brydegaard|title=Conference on Lasers and Electro-Optics |chapter=Multispectral polarimetric modulation spectroscopy for species and sex determination of Malaria disease vectors |date=2017-05-14 |pages=ATh1B.2 |url=https://www.osapublishing.org/abstract.cfm?uri=CLEO_AT-2017-ATh1B.2 |publisher=Optical Society of America |doi=10.1364/CLEO_AT.2017.ATh1B.2|isbn=978-1-943580-27-9|s2cid=21537355}}</ref> In 2017 a patent application on this technology was published in the United States, Europe, and China.<ref>{{cite web |url=https://patents.google.com/patent/WO2017182440A1/en |website=Google Patents |access-date=4 June 2019|title=Improvements in or relating to optical remote sensing systems for aerial and aquatic fauna, and use thereof }}</ref> Another application is crop mapping in orchards and vineyards, to detect foliage growth and the need for pruning or other maintenance, detect variations in fruit production, or count plants. Lidar is useful in [[GNSS]]-denied situations, such as nut and fruit orchards, where foliage causes [[Error analysis for the Global Positioning System|interference]] for agriculture equipment that would otherwise utilize a precise GNSS fix. Lidar sensors can detect and track the relative position of rows, plants, and other markers so that farming equipment can continue operating until a GNSS fix is reestablished. Controlling weeds requires identifying plant species.
This can be done by using 3-D lidar and machine learning.<ref name=":8">{{Cite book|last1=Weiss|first1=Ulrich|last2=Biber|first2=Peter|last3=Laible|first3=Stefan|last4=Bohlmann|first4=Karsten|last5=Zell|first5=Andreas|chapter=Plant Species Classification using a 3D LIDAR Sensor and Machine Learning|title=2010 International Conference on Machine Learning and Applications |year=2010 |isbn=978-1-4244-9211-4}}</ref> Lidar produces plant contours as a "point cloud" with range and reflectance values. This data is transformed, and features are extracted from it. When the species is known, its labelled features are stored as training examples for identifying the species in the real environment. This method is efficient because it uses a low-resolution lidar and supervised learning. It includes an easy-to-compute feature set with common statistical features which are independent of the plant size.<ref name=":8" /> ===Archaeology=== Lidar has many uses in archaeology, including the planning of field campaigns, mapping features under forest canopy, and providing an overview of broad, continuous features that cannot be distinguished from the ground.<ref>{{cite web|url=https://www.unb.ca/passc/ImpactDatabase/images/whitecourt.htm |title=EID; crater beneath canopy |publisher=Unb.ca |date=2013-02-18 |access-date=2013-05-06}}</ref> Lidar can produce high-resolution datasets quickly and cheaply. Lidar-derived products can be easily integrated into a Geographic Information System (GIS) for analysis and interpretation. Lidar can also help to create high-resolution digital elevation models (DEMs) of archaeological sites that can reveal micro-topography otherwise hidden by vegetation. The intensity of the returned lidar signal can be used to detect features buried under flat vegetated surfaces such as fields, especially when mapping using the infrared spectrum.
The presence of these features affects plant growth and thus the amount of infrared light reflected back.<ref>{{cite book|url=http://www.english-heritage.org.uk/publications/light-fantastic/|title=The Light Fantastic: Using airborne lidar in archaeological survey|publisher=[[English Heritage]]|year=2010|page=45}}</ref> For example, at [[Fort Beauséjour]] – Fort Cumberland National Historic Site, Canada, lidar discovered archaeological features related to the siege of the Fort in 1755. Features that could not be distinguished on the ground or through aerial photography were identified by overlaying hill shades of the DEM created with artificial illumination from various angles. Another example is work at [[Caracol]] by [[Arlen F. Chase|Arlen Chase]] and his wife [[Diane Zaino Chase]].<ref>{{cite news|url=https://www.nytimes.com/2010/05/11/science/11maya.html?pagewanted=all|title=Mapping Ancient Civilization, in a Matter of Days|author=John Nobel Wilford|date=2010-05-10|newspaper=New York Times|access-date=2010-05-11}}</ref> In 2012, lidar was used to search for the legendary city of [[La Ciudad Blanca]] or "City of the Monkey God" in the [[La Mosquitia (Honduras)|La Mosquitia]] region of the Honduran jungle. 
During a seven-day mapping period, evidence was found of man-made structures.<ref>{{cite news|author=Stephanie Pappas|title=Ruins of Lost City May Lurk Deep in Honduras Rain Forest|date=May 15, 2013|url=http://www.livescience.com/32017-lost-city-honduras-images.html|work=Live Science|access-date=May 15, 2013}}</ref><ref name="Preston2015">{{cite news|url=http://news-beta.nationalgeographic.com/2015/03/150302-honduras-lost-city-monkey-god-maya-ancient-archaeology/|archive-url=https://web.archive.org/web/20150303051331/http://news-beta.nationalgeographic.com/2015/03/150302-honduras-lost-city-monkey-god-maya-ancient-archaeology/|url-status=dead|archive-date=March 3, 2015|title=Lost City Discovered in the Honduran Rain Forest|author=Douglas Preston|date=2 Mar 2015|newspaper=National Geographic|access-date=3 March 2015}}</ref> In June 2013, the rediscovery of the city of [[Mahendraparvata]] was announced.<ref>{{cite web|url=https://www.smh.com.au/national/jungle-surrenders-its-lost-city-20130614-2oa9b.html |title=Jungle surrenders its lost city |website=Smh.com.au |date= 2013-06-14|access-date=2016-02-22}}</ref> In southern New England, lidar was used to reveal stone walls, building foundations, abandoned roads, and other landscape features obscured in aerial photography by the region's dense forest canopy.<ref>{{cite journal|title=Rediscovering the lost archaeological landscape of southern New England using airborne light detection and ranging (LiDAR) |doi=10.1016/j.jas.2013.12.004 |volume=43 |pages=9–20 |journal=Journal of Archaeological Science|year=2014 |last1=Johnson |first1=Katharine M |last2=Ouimet |first2=William B |bibcode=2014JArSc..43....9J }}</ref><ref>{{cite web|author=Edwin Cartlidge |url=https://www.science.org/content/article/lasers-unearth-lost-agropolis-new-england |title=Lasers Unearth Lost 'Agropolis' of New England | Science | AAAS |website=News.sciencemag.org |date= 2014-01-10|access-date=2016-02-22}}</ref><ref>{{cite 
web|url=http://news.nationalgeographic.com/news/2014/01/140103-new-england-archaeology-lidar-science |archive-url=https://web.archive.org/web/20140107195436/http://news.nationalgeographic.com/news/2014/01/140103-new-england-archaeology-lidar-science |url-status=dead |archive-date=January 7, 2014 |title="Lost" New England Revealed by High-Tech Archaeology |website=News.nationalgeographic.com |date=2014-01-03 |access-date=2016-02-22}}</ref> In Cambodia, lidar data was used by [[Damian Evans]] and Roland Fletcher to reveal anthropogenic changes to the Angkor landscape.<ref>{{cite journal | last1 = Evans | first1 = D.H. | last2 = Fletcher | first2 = R.J. |display-authors=etal | year = 2013| title = Uncovering archaeological landscapes at Angkor using lidar | journal = PNAS | volume = 110 | issue = 31| pages = 12595–12600 | doi = 10.1073/pnas.1306539110 | pmid=23847206 | pmc=3732978| bibcode = 2013PNAS..11012595E | doi-access = free }}</ref> In 2012, lidar revealed that the [[Purépecha]] settlement of [[Angamuco]] in [[Michoacán]], Mexico had about as many buildings as today's Manhattan;<ref>{{Cite news|url=https://www.theguardian.com/science/2018/feb/15/laser-scanning-reveals-lost-ancient-mexican-city-had-as-many-buildings-as-manhattan|title=Laser scanning reveals 'lost' ancient Mexican city 'had as many buildings as Manhattan'|first=Nicola|last=Davis|date=February 15, 2018|via=www.theguardian.com|newspaper=The Guardian}}</ref> while in 2016, its use in mapping ancient Maya causeways in northern Guatemala revealed 17 elevated roads linking the ancient city of [[El Mirador]] to other sites.<ref>{{Cite web|url=https://www.smithsonianmag.com/smart-news/lidar-scans-maya-network-roads-180961995/|title=LiDAR Scans Reveal Maya Civilization's Sophisticated Network of Roads|website=smithsonianmag.com |access-date=February 28, 2018}}</ref><ref>{{Cite web|url=https://www.seeker.com/ancient-mayan-superhighways-found-in-the-guatemala-jungle-2219303581.html|title=Ancient Mayan
Superhighways Found in the Guatemala Jungle|date=2017-01-27}}</ref> In 2018, archaeologists using lidar discovered more than 60,000 man-made structures in the [[Maya Biosphere Reserve#Archaeology|Maya Biosphere Reserve]], a "major breakthrough" that showed the [[Maya civilization]] was much larger than previously thought.<ref>{{Cite news|url=https://news.nationalgeographic.com/2018/02/maya-laser-lidar-guatemala-pacunam/|archive-url=https://archive.today/20180201224439/https://news.nationalgeographic.com/2018/02/maya-laser-lidar-guatemala-pacunam/|url-status=dead|archive-date=February 1, 2018|title=This Ancient Civilization Was Twice As Big As Medieval England|date=2018-02-01|access-date=2018-02-05}}</ref><ref>{{Cite web|url=https://www.msn.com/en-sg/news/world/archaeologists-find-ancient-lost-cities-using-lasers/ar-BBNIdqB|title=Archaeologists Find Ancient Lost Cities Using Lasers|website=msn.com|access-date=2019-09-08}}</ref><ref>{{Cite web|url=https://www.nationalgeographic.com/news/2018/02/maya-laser-lidar-guatemala-pacunam/|archive-url=https://web.archive.org/web/20190807025257/https://www.nationalgeographic.com/news/2018/02/maya-laser-lidar-guatemala-pacunam/|url-status=dead|archive-date=August 7, 2019|title=This Ancient Civilization Was Twice As Big As Medieval England|date=2018-02-01|website=National Geographic News|access-date=2019-09-08}}</ref><ref>{{Cite news|url=https://www.bbc.com/news/world-latin-america-42916261|title=Sprawling Maya network discovered under Guatemala jungle|date=2018-02-02}}</ref><ref>{{Cite web|url=https://www.newsweek.com/archaeologists-find-ancient-cities-using-lasers-1145042|title=Archaeologists Find Ancient Mayan Lost Cities in Guatemala Using Lasers |date=2018-09-29|website=Newsweek }}</ref><ref>{{Cite web|url=https://www.history.com/news/ancient-maya-structures-guatemala-lasers|title=Lasers Reveal 60,000 Ancient Maya Structures in Guatemala|last=Little|first=Becky|website=History |access-date=2019-09-08}}</ref><ref>{{Cite 
web|url=https://www.yahoo.com/news/hidden-ancient-mayan-apos-megalopolis-110002872.html|title=Hidden Ancient Mayan 'Megalopolis' With 60,000 Structures Discovered in Guatemala Using Lasers|website=yahoo.com|access-date=2019-09-08|archive-date=2019-09-05|archive-url=https://web.archive.org/web/20190905113109/https://www.yahoo.com/news/hidden-ancient-mayan-apos-megalopolis-110002872.html|url-status=dead}}</ref><ref>{{Cite web|url=https://www.businessinsider.com/60000-lost-mayan-structures-found-beneath-guatemalan-jungle-2018-2|title=Archaeologists found thousands of hidden structures in the Guatemalan jungle – and it could re-write human history|last=Berke|first=Jeremy|website=Business Insider|access-date=2019-09-08|date=2018-02-02}}</ref><ref>{{Cite web|url=https://www.newsweek.com/hidden-ancient-mayan-megalopolis-60000-structures-discovered-guatemala-using-797865|title=Hidden Ancient Mayan 'Megalopis' with 60,000 Structures Discovered in Guatemala Using Lasers |date=2018-02-02|website=Newsweek }}</ref><ref>{{Cite web|url=https://olodonation.com/2018/09/30/archaeologists-discover-ancient-mayan-lost-city-in-northern-guatemala-using-lasers/|title=Archaeologists Discover Ancient Mayan Lost City In Northern Guatemala Using Lasers|last=Chukwurah|first=Precious|date=2018-09-30|website=Nigeria's Entertainment News, Music, Video, Lifestyle|access-date=2019-09-08}}</ref><ref>{{Cite web|url=https://bgr.com/2018/02/02/mayan-megacity-discovered-guatemala-jungle/|title=Archaeologists discovered an ancient Mayan megacity hidden in a Guatemalan jungle|last=Wehner|first=Mike|date=2018-02-02|website=BGR|access-date=2019-09-08}}</ref> In 2024, archaeologists using lidar discovered the [[Upano Valley sites]].<ref>{{cite web|url=https://apnews.com/article/amazon-lost-cities-ecuador-archaeology-8e48942ff8fd1c52611158696b74e214|title=A cluster of lost cities in Ecuadorian Amazon that lasted 1,000 years has been mapped|website=[[Associated Press]]|date=2024-01-11}}</ref><ref>{{cite 
journal|title=Two thousand years of garden urbanism in the Upper Amazon|url=https://www.science.org/doi/10.1126/science.adi6317|author1=Stéphen Rostain|author2=Antoine Dorison|author3=Geoffroy de Saulieu|author4=Heiko Prümers|author5=Jean-Luc Le Pennec|author6=Fernando Mejía Mejía|author7=Ana Maritza Freire|author8=Jaime R. Pagán-Jiménez|author9=Philippe Descola|date=2024-01-11|journal=[[Science (journal)|Science]]|volume=383|issue=6679|pages=183–189|doi=10.1126/science.adi6317|pmid=38207020 |bibcode=2024Sci...383..183R }}</ref> ===Autonomous vehicles=== [[File:Cruise Automation Bolt EV third generation in San Francisco.jpg|thumbnail|[[Cruise Automation]] self-driving car with five [[Velodyne Lidar]] units on the roof]] [[File:3D Sick Lidar.jpg|thumbnail|Forecast 3-D Laser System using a SICK LMC lidar sensor]] [[Autonomous car|Autonomous vehicles]] may use lidar for obstacle detection and avoidance to navigate safely through environments.<ref name=":13" /><ref>By Steve Taranovich, EDN. "[http://www.edn.com/design/analog/4442319/Autonomous-automotive-sensors—How-processor-algorithms-get-their-inputs Autonomous automotive sensors: How processor algorithms get their inputs]." July 5, 2016. Retrieved August 9, 2016.</ref> The introduction of lidar was a key enabler behind [[Stanley (vehicle)|Stanley]], the first autonomous vehicle to successfully complete the [[DARPA Grand Challenge]].<ref>{{Cite magazine|title=The Oral History of the Darpa Challenge, the Grueling Robot Race That Launched the Self-Driving Car|magazine=Wired|url=https://www.wired.com/story/darpa-grand-challenge-2004-oral-history/|access-date=2020-12-24|issn=1059-1028}}</ref> Point cloud output from the lidar sensor provides the necessary data for robot software to determine where potential obstacles exist in the environment and where the robot is in relation to those potential obstacles.
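A minimal sketch of how software might flag obstacle points in a lidar point cloud follows. The thresholds and the flat-ground assumption are illustrative only; production stacks use ground-plane fitting, clustering, and tracking rather than fixed cut-offs.

```python
import math

def find_obstacles(points, ground_z=0.0, min_height=0.3, max_range=30.0):
    """Return points standing above an assumed flat ground plane within range.

    points: (x, y, z) coordinates in the vehicle frame, in metres.
    A point is a potential obstacle if it is high enough above the ground
    plane and close enough to matter for path planning.
    """
    obstacles = []
    for x, y, z in points:
        if z - ground_z >= min_height and math.hypot(x, y) <= max_range:
            obstacles.append((x, y, z))
    return obstacles

# Toy cloud: a ground return, a nearby tall object, and a distant object
cloud = [(5.0, 0.0, 0.05), (8.0, 1.0, 1.2), (50.0, 2.0, 2.0)]
print(find_obstacles(cloud))  # [(8.0, 1.0, 1.2)]
```

Only the nearby tall point survives both filters; the ground return is too low and the distant point is out of range.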
The ''Singapore-MIT Alliance for Research and Technology'' (SMART) is actively developing technologies for autonomous lidar vehicles.<ref>{{cite web|url=http://motioncars.inquirer.net/36858/filipino-turns-ordinary-car-into-autonomous-vehicle |title=Filipino turns ordinary car into autonomous vehicle – Motioncars |website=Motioncars.inquirer.net |date=2015-05-25 |access-date=2016-02-22}}</ref> The [[Dynamic Radar Cruise Control|very first generations]] of automotive [[adaptive cruise control]] systems used only lidar sensors. In transportation systems, to ensure vehicle and passenger safety and to develop electronic systems that deliver driver assistance, understanding the vehicle and its surrounding environment is essential. Lidar systems play an important role in the safety of transportation systems. Many electronic systems which enhance driver assistance and vehicle safety such as Adaptive Cruise Control (ACC), Emergency Brake Assist, and [[Anti-lock braking system|Anti-lock Braking System]] (ABS) depend on the detection of a vehicle's environment to act autonomously or semi-autonomously. Lidar mapping and estimation achieve this. Current lidar systems use rotating hexagonal mirrors which split the laser beam. The upper three beams are used for vehicles and obstacles ahead, and the lower beams are used to detect lane markings and road features.<ref name=":4">{{Cite book|last1=Takagi|first1=Kiyokazu|last2=Morikawa|first2=Katsuhiro|last3=Ogawa|first3=Takashi|last4=Saburi|first4=Makoto|title=2006 IEEE Intelligent Vehicles Symposium |chapter=Road Environment Recognition Using On-vehicle LIDAR |pages=120–125 |year=2006 |isbn= 978-4-901122-86-3|doi=10.1109/IVS.2006.1689615|s2cid=15568035}}</ref> A major advantage of lidar is that it captures spatial structure, and this data can be fused with data from other sensors such as [[radar]]
to get a better picture of both the static and dynamic properties of the objects present in the environment. Conversely, a significant issue with lidar is the difficulty in reconstructing point cloud data in poor weather conditions. In heavy rain, for example, the light pulses emitted from the lidar system are partially reflected off of rain droplets, which adds noise to the data, called 'echoes'.<ref>{{Cite book|title= 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC)|pages=2242–2247|doi=10.1109/ITSC.2016.7795918|chapter= Test methodology for rain influence on automotive surround sensors|year=2016|last1=Hasirlioglu|first1=Sinan|last2=Kamann|first2=Alexander|last3=Doric|first3=Igor|last4=Brandmeier|first4=Thomas|isbn=978-1-5090-1889-5|s2cid=2334608}}</ref> An approach to obstacle detection and road environment recognition using lidar, proposed by Kun Zhou et al.,<ref>{{Cite book|last1=Zhou|first1=Kun|last2=Wang|first2=Xiqin|last3=Tomizukat|first3=Masayoshi|last4=Zhang|first4=Wei-Bin|last5=Chant|first5=Ching-Yao|title=Proceedings of the 2002 American Control Conference (IEEE Cat. No.CH37301) |chapter=A new maneuvering target tracking algorithm with input estimation |volume=1 |pages=166–171 |year= 2002 |isbn=978-0-7803-7298-6|doi=10.1109/ACC.2002.1024798|s2cid=114167319|url=https://zenodo.org/record/1263011}}</ref> not only detects and tracks objects but also recognizes lane markings and road features. As mentioned earlier, such lidar systems use rotating hexagonal mirrors that split the laser beam into six beams. The upper three layers are used to detect forward objects such as vehicles and roadside objects. The sensor is made of weather-resistant material. The data detected by lidar are clustered into several segments and tracked by a [[Kalman filter]].
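The Kalman-filter tracking mentioned above can be illustrated with a minimal scalar filter that smooths noisy range readings of a tracked segment. This is a simplified sketch with assumed noise values, not the two-stage filter of the cited system.

```python
def kalman_update(x_est, p_est, z, q=0.01, r=0.25):
    """One predict/update cycle of a scalar Kalman filter.

    x_est, p_est : previous range estimate and its variance
    z            : new (noisy) lidar range measurement
    q, r         : process and measurement noise variances (assumed values)
    """
    # Predict: the range is assumed roughly constant between scans,
    # so prediction only inflates the uncertainty by the process noise.
    p_pred = p_est + q
    # Update: the Kalman gain weighs the prediction against the measurement.
    k = p_pred / (p_pred + r)
    x_new = x_est + k * (z - x_est)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 10.0, 1.0                    # initial range estimate and variance
for z in (10.3, 9.8, 10.1, 10.2):   # noisy range readings of one segment
    x, p = kalman_update(x, p, z)
print(round(x, 2))  # 10.1
```

Each update shrinks the estimate's variance, so later measurements are trusted less relative to the accumulated estimate; a full tracker would extend the state with velocity to model accelerating objects.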
Data clustering here is based on the characteristics of each segment according to an object model, which distinguishes different objects such as vehicles and signboards. These characteristics include the dimensions of the object. The reflectors on the rear edges of vehicles are used to differentiate vehicles from other objects. Object tracking is done using a two-stage Kalman filter, considering the stability of tracking and the accelerated motion of objects.<ref name=":4" /> Lidar reflective intensity data is also used for curb detection by making use of robust regression to deal with occlusions. Road marking is detected using a modified Otsu method by distinguishing rough and shiny surfaces.<ref>{{Cite journal|last1=Y. Hata|first1=Alberto|last2=F. Wolf|first2=Denis|title=Feature Detection for Vehicle Localization in Urban Environments Using a Multilayer LIDAR|journal=IEEE Transactions on Intelligent Transportation System|volume=17|issn=1558-0016|number=2}}</ref> Roadside reflectors that indicate the lane border are sometimes hidden for various reasons, so other information is needed to recognize the road border. The lidar used in this method can measure the reflectivity of objects, so this data can also be used to recognize the road border. In addition, the use of a sensor with a weather-robust head helps to detect objects even in bad weather conditions. A canopy height model computed before and after a flood is a good example: lidar can capture highly detailed canopy height data as well as road borders. Lidar measurements help identify the spatial structure of the obstacle.
This helps distinguish objects based on size and estimate the impact of driving over them.<ref name=":5">{{Cite book|last1=Lindner|first1=Philipp|last2=Wanielik|first2=Gerd|date=2009 |title=2009 IEEE Workshop on Computational Intelligence in Vehicles and Vehicular Systems |chapter=3D LIDAR processing for vehicle safety and environment recognition |pages=66–71 |isbn= 978-1-4244-2770-3|doi=10.1109/CIVVS.2009.4938725|s2cid=18520919}}</ref> Lidar systems provide better range and a large field of view, which helps in detecting obstacles on curves. This is one of their major advantages over [[Radar systems|RADAR systems]], which have a narrower field of view. The fusion of lidar measurements with those of other sensors makes the system robust and useful in real-time applications, since lidar-only systems cannot estimate dynamic information about detected objects.<ref name=":5" /> It has been shown that lidar can be spoofed, tricking self-driving cars into taking evasive action.<ref>{{Cite news|url=https://www.theguardian.com/technology/2015/sep/07/hackers-trick-self-driving-cars-lidar-sensor|title=Hackers can trick self-driving cars into taking evasive action|journal=The Guardian|first=Samuel|last=Gibbs|date=7 September 2015}}</ref> ===Ecology and conservation=== [[File:Forest LIDAR.jpg|thumb|Lidar imaging comparing old-growth forest (right) to a new plantation of trees (left)]] Lidar has also found many applications for mapping natural and managed landscapes such as forests, wetlands,<ref>{{cite journal |last1=Xu |first1=Haiqing |last2=Toman |first2=Elizabeth |last3=Zhao |first3=Kaiguang |last4=Baird |first4=John |title=Fusion of Lidar and Aerial Imagery to Map Wetlands and Channels via Deep Convolutional Neural Network |journal=Transportation Research Record |date=2022 |volume=2676 |issue=12 |pages=374–381 |doi=10.1177/03611981221095522 |s2cid=251780248 |url=https://journals.sagepub.com/doi/10.1177/03611981221095522}}</ref> and grasslands.
[[canopy (forest)|Canopy]] heights, [[biomass]] measurements, and leaf area can all be studied using airborne lidar systems.<ref>{{Cite journal|last1=Naesset|first1=Erik|date=April 1997|title=Determination of mean tree height of forest stands using airborne laser scanner data|journal=ISPRS Journal of Photogrammetry and Remote Sensing|volume=52|issue=2|pages=49–56|doi=10.1016/S0924-2716(97)83000-6|bibcode=1997JPRS...52...49N }}</ref><ref>{{Cite journal|last1=Johnson|first1=Lucas|last2=Mahoney|first2=Michael|last3=Bevilacqua|first3=Eddie|last4=Stehman|first4=Stephen|last5=Domke|first5=Grant|last6=Beier|first6=Colin|date=November 2022|title=Fine-resolution landscape-scale biomass mapping using a spatiotemporal patchwork of LiDAR coverages|journal=International Journal of Applied Earth Observation and Geoinformation|volume=114|page=103059|doi=10.1016/j.jag.2022.103059|s2cid=248834425 |doi-access=free|arxiv=2205.08530}}</ref><ref>{{Cite journal|last1=Morsdorf|first1=Felix|last2=Kötz|first2=Benjamin|last3=Meier|first3=Erich|last4=Itten|first4=K.I.|last5=Allgöwer|first5=Britta|date=15 September 2006|title=Estimation of LAI and fractional cover from small footprint airborne laser scanning data based on gap fraction|journal=Remote Sensing of Environment|volume=104|issue=1|pages=50–61|doi=10.1016/j.rse.2006.04.019|bibcode=2006RSEnv.104...50M }}</ref><ref>{{cite journal |last1=Zhao |first1=Kaiguang |last2=Popescu |first2=Sorin |title=Lidar-based mapping of leaf area index and its use for validating GLOBCARBON satellite LAI product in a temperate forest of the southern USA |journal=Remote Sensing of Environment |date=2009 |volume=113 |issue=8 |pages=1628–1645 |doi=10.1016/j.rse.2009.03.006 |bibcode=2009RSEnv.113.1628Z |url=https://www.sciencedirect.com/science/article/pii/S0034425709000893}}</ref> Lidar is also used by many industries, such as energy and railroad companies, and by departments of transportation, as a faster way of surveying.
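Canopy height metrics of this kind are commonly derived by differencing two lidar-derived rasters: a digital surface model (DSM) from first returns and a digital terrain model (DTM) from ground returns. A minimal sketch of the idea with invented raster values (real workflows first interpolate classified lidar returns into such grids):

```python
import numpy as np

# Toy 2x2 elevation rasters in metres (illustrative values only):
# DSM from first returns (top of canopy), DTM from ground returns.
dsm = np.array([[212.0, 215.5],
                [210.2, 230.0]])
dtm = np.array([[210.0, 211.5],
                [210.2, 205.0]])

chm = dsm - dtm        # canopy height model: vegetation height above ground
print(chm)             # per-cell canopy heights
print(chm.max())       # tallest canopy in this toy plot: 25.0
```

Plot-level statistics such as mean or maximum canopy height follow directly from aggregating the resulting grid.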
Topographic maps can also be generated readily from lidar, including for recreational use such as in the production of [[orienteering]] maps.<ref>{{cite web|url=http://www.lidarbasemaps.org |title=Lidar Links of Use in Mapping |website=Lidarbasemaps.org |access-date=2016-02-22}}</ref> Lidar has also been applied to estimate and assess the biodiversity of plants, fungi, and animals.<ref>{{Cite journal|last1=Clawges|first1=Rick|last2=Vierling|first2=Kerri|last3=Vierling|first3=Lee|last4=Rowell|first4=Eric|date=15 May 2008|title=The use of airborne lidar to assess avian species diversity, density, and occurrence in a pine/aspen forest|journal=Remote Sensing of Environment|volume=112|issue=5|pages=2064–2073|doi=10.1016/j.rse.2007.08.023|bibcode=2008RSEnv.112.2064C|issn=0034-4257}}</ref><ref>{{Cite bioRxiv |last1=Moeslund|first1=Jesper Erenskjold|last2=Zlinszky|first2=András|last3=Ejrnæs|first3=Rasmus|last4=Brunbjerg|first4=Ane Kirstine|last5=Bøcher|first5=Peder Klith|last6=Svenning|first6=Jens-Christian|last7=Normand|first7=Signe|date=2019-01-04|title=LIDAR explains diversity of plants, fungi, lichens and bryophytes across multiple habitats and large geographic extent|biorxiv=10.1101/509794}}</ref><ref>{{Cite journal|last1=Simonson|first1=William D.|last2=Allen|first2=Harriet D.|last3=Coomes|first3=David A.|date=2014-07-05|title=Applications of airborne lidar for the assessment of animal species diversity|journal=Methods in Ecology and Evolution|volume=5|issue=8|pages=719–729|doi=10.1111/2041-210x.12219|issn=2041-210X|doi-access=free|bibcode=2014MEcEv...5..719S }}</ref> Using [[Durvillaea|southern bull kelp]] in New Zealand, coastal lidar mapping data has been compared with [[population genomics|population genomic]] evidence to form hypotheses regarding the occurrence and timing of prehistoric earthquake uplift events.<ref name="Vaux-2023a">{{cite journal |doi=10.1098/rsif.2023.0105 |title=Integrating kelp genomic analyses and geological data to reveal ancient 
earthquake impacts |journal=Journal of the Royal Society Interface |volume= 20|issue= 202|pages= |year=2023 |last1=Vaux|first1=Felix |last2=Fraser|first2=Ceridwen I. |last3=Craw|first3=Dave |last4=Read|first4=Stephen |last5=Waters|first5=Jonathan M.|pmid= 37194268|pmc=10189309 |doi-access= }}</ref> ===Forestry=== [[File:Lidar forestry.png|thumb|A typical workflow to derive forest information at individual tree or plot levels from lidar point clouds<ref name="Zhao2019" />]] Lidar systems have also been applied to improve forestry management.<ref name="WulderBater2008">{{cite journal|last1=Wulder|first1=Michael A|last2=Bater|first2=Christopher W|last3=Coops|first3=Nicholas C|last4=Hilker|first4=Thomas|last5=White|first5=Joanne C|title=The role of LiDAR in sustainable forest management|journal=The Forestry Chronicle|volume=84|issue=6|year=2008|pages=807–826|issn=0015-7546|doi=10.5558/tfc84807-6|citeseerx=10.1.1.728.1314}}</ref> Measurements are used to take inventory in forest plots as well as to calculate individual tree heights, crown width, and crown diameter. Other statistical analyses use lidar data to estimate total plot information such as canopy volume, mean, minimum, and maximum heights, vegetation cover, biomass, and carbon density.<ref name="Zhao2019">{{cite journal |last1=Zhao |first1=Kaiguang |last2=Suarez |first2=Juan C |last3=Garcia |first3=Mariano |last4=Hu |first4=Tongxi |last5=Wang |first5=Cheng |last6=Londo |first6=Alexis |title=Utility of multitemporal lidar for forest and carbon monitoring: Tree growth, biomass dynamics, and carbon flux |journal=Remote Sensing of Environment |date=2018 |volume=204 |pages=883–897 |doi=10.1016/j.rse.2017.09.007 |bibcode=2018RSEnv.204..883Z |url=https://go.osu.edu/biomass}}</ref> Aerial lidar was used to map the bush fires in Australia in early 2020.
The data were processed to view bare earth and to identify healthy and burned vegetation.<ref>{{Cite web |url=https://www.airborneresearch.org.au/fires-2020 |title=FIRES 2020 | airborne |access-date=2020-02-10 |archive-date=2020-06-13 |archive-url=https://web.archive.org/web/20200613110743/https://www.airborneresearch.org.au/fires-2020}}</ref> ===Geology and soil science=== High-resolution [[digital elevation map]]s generated by airborne and stationary lidar have led to significant advances in [[geomorphology]] (the branch of geoscience concerned with the origin and evolution of Earth's surface topography). The abilities of lidar to detect subtle topographic features such as river terraces, river channel banks,<ref>{{Cite journal |last1=Conesa-García |first1=Carmelo |last2=Puig-Mengual |first2=Carlos |last3=Riquelme |first3=Adrián |last4=Tomás |first4=Roberto |last5=Martínez-Capel |first5=Francisco |last6=García-Lorenzo |first6=Rafael |last7=Pastor |first7=José L. |last8=Pérez-Cutillas |first8=Pedro |last9=Martínez-Salvador |first9=Alberto |last10=Cano-Gonzalez |first10=Miguel |date=February 2022 |title=Changes in stream power and morphological adjustments at the event-scale and high spatial resolution along an ephemeral gravel-bed channel |url=https://doi.org/10.1016/j.geomorph.2021.108053 |journal=Geomorphology |volume=398 |pages=108053 |doi=10.1016/j.geomorph.2021.108053 |bibcode=2022Geomo.39808053C |issn=0169-555X|hdl=10251/190056 |hdl-access=free }}</ref> and glacial landforms,<ref>{{cite journal |last1=Janowski |first1=Lukasz |last2=Tylmann |first2=Karol |last3=Trzcinska |first3=Karolina |last4=Rudowski |first4=Stanislaw |last5=Tegowski |first5=Jaroslaw |title=Exploration of Glacial Landforms by Object-Based Image Analysis and Spectral Parameters of Digital Elevation Model |journal=IEEE Transactions on Geoscience and Remote Sensing |date=2021 |volume=60 |pages=1–17 |doi=10.1109/TGRS.2021.3091771|doi-access=free }}</ref> to measure the land-surface elevation beneath the vegetation canopy, to better resolve spatial derivatives of elevation, to detect rockfalls,<ref>{{Cite journal |last1=Tomás |first1=R. |last2=Abellán |first2=A. |last3=Cano |first3=M. |last4=Riquelme |first4=A. |last5=Tenza-Abril |first5=A. J. |last6=Baeza-Brotons |first6=F. |last7=Saval |first7=J. M. |last8=Jaboyedoff |first8=M. |date=2018-02-01 |title=A multidisciplinary approach for the investigation of a rock spreading on an urban slope |journal=Landslides |volume=15 |issue=2 |pages=199–217 |doi=10.1007/s10346-017-0865-0 |issn=1612-5118|doi-access=free |bibcode=2018Lands..15..199T |hdl=10045/73318 |hdl-access=free }}</ref><ref>{{Cite journal |last1=Tonini |first1=Marj |last2=Abellan |first2=Antonio |date=2014-06-30 |title=Rockfall detection from terrestrial LiDAR point clouds: A clustering approach using R |url=https://josis.org/index.php/josis/article/view/50 |journal=Journal of Spatial Information Science |issue=8 |pages=95–110 |doi=10.5311/josis.2014.8.123 |issn=1948-660X}}</ref> and to detect elevation changes between repeat surveys<ref>{{Cite journal |last1=Hu |first1=Liuru |last2=Navarro-Hernández |first2=María I. |last3=Liu |first3=Xiaojie |last4=Tomás |first4=Roberto |last5=Tang |first5=Xinming |last6=Bru |first6=Guadalupe |last7=Ezquerro |first7=Pablo |last8=Zhang |first8=Qingtao |date=October 2022 |title=Analysis of regional large-gradient land subsidence in the Alto Guadalentín Basin (Spain) using open-access aerial LiDAR datasets |url=https://doi.org/10.1016/j.rse.2022.113218 |journal=Remote Sensing of Environment |volume=280 |pages=113218 |doi=10.1016/j.rse.2022.113218 |bibcode=2022RSEnv.28013218H |issn=0034-4257|hdl=10045/126163 |hdl-access=free }}</ref> have enabled many novel studies of the physical and chemical processes that shape landscapes.<ref>{{cite journal|author1=Hughes, M. W.|author2=Quigley, M. C|author3=van Ballegooy, S.|author4=Deam, B. L.|author5=Bradley, B. A.|author6=Hart, D.
E.|date=2015|title=The sinking city: Earthquakes increase flood hazard in Christchurch, New Zealand|journal=GSA Today|volume=25 | issue = 3 |pages=4–10|url=https://www.geosociety.org/gsatoday/archive/25/3/article/i1052-5173-25-3-4.htm?rss=1|access-date=2016-02-22|doi=10.1130/Geology}}</ref> In 2005 the [[Tour Ronde]] in the [[Mont Blanc massif]] became the first high [[Alps|alpine mountain]] on which lidar was employed to monitor the increasing occurrence of severe rock-fall over large rock faces, attributed to [[climate change]] and degradation of permafrost at high altitude.<ref>{{cite journal|last1=Rabatel|first1=Antoine|last2=Deline|first2=Philip|last3=Jaillet|first3=Ste'phane|last4=Ravanel|first4=Ludovic|title=Rock falls in high-alpine rock walls quantified by terrestrial lidar measurements: A case study in the Mont Blanc area|journal=Geophysical Research Letters|date=28 May 2008|volume=35|issue=10|pages=L10502|doi=10.1029/2008GL033424|bibcode = 2008GeoRL..3510502R |s2cid=52085197 |doi-access=}}</ref> In [[structural geology]] and geophysics, airborne lidar is combined with [[GNSS]] to detect and study [[Fault (geology)|fault]]s and to measure [[Tectonic uplift|uplift]].<ref>{{Cite journal|last1=Cunningham|first1=Dickson|last2=Grebby|first2=Stephen|last3=Tansey|first3=Kevin|last4=Gosar|first4=Andrej|last5=Kastelic|first5=Vanja|date=2006|title=Application of airborne LiDAR to mapping seismogenic faults in forested mountainous terrain, southeastern Alps, Slovenia|journal=Geophysical Research Letters|volume=33|issue=20|pages=L20308|doi=10.1029/2006GL027014|issn=1944-8007|bibcode=2006GeoRL..3320308C|url=http://eprints.nottingham.ac.uk/33910/1/Cunningham_et_al_2006.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://eprints.nottingham.ac.uk/33910/1/Cunningham_et_al_2006.pdf |archive-date=2022-10-09 |url-status=live|doi-access=free}}</ref> The output of the two technologies can produce extremely accurate
elevation models for terrain – models that can even measure ground elevation through trees. This combination was used most famously to find the location of the [[Seattle Fault]] in [[Washington (state)|Washington]], United States.<ref>{{cite web|url=http://www.seattlepi.com/local/19144_quake18.shtml |title=LIDAR shows where earthquake risks are highest |website=Seattlepi.com |date=2001-04-17 |access-date=2016-02-22}}</ref> This combination was also used to measure uplift at [[Mount St. Helens]], using data from before and after the 2004 uplift.<ref>[http://wagda.lib.washington.edu/data/type/elevation/lidar/st_helens/ 'Mount Saint Helens LIDAR Data', ''Washington State Geospatial Data Archive'' (September 13, 2006)]. Retrieved 8 August 2007.</ref> Airborne lidar systems monitor [[glacier]]s and can detect subtle amounts of growth or decline. A satellite-based system, the [[NASA]] [[ICESat]], includes a lidar sub-system for this purpose. The NASA Airborne Topographic Mapper<ref>[http://atm.wff.nasa.gov/ 'Airborne Topographic Mapper', ''NASA.gov'']. Retrieved 8 August 2007.</ref> is also used extensively to monitor [[glacier]]s and perform coastal change analysis. The combination is also used by soil scientists when creating a [[soil survey]]. The detailed terrain modeling allows soil scientists to see slope changes and landform breaks that indicate patterns in soil spatial relationships. ===Atmosphere=== [[File:Lidar-IGF-UW-2024 AB-02.jpg|thumb|Near range lidar at Institute of Geophysics, Warsaw, Poland]] <!--Differential absorption lidar redirects here--> {{main|Atmospheric lidar}} Lidar for meteorological applications, initially based on [[ruby laser]]s, was constructed shortly after the invention of the laser and represents one of the first applications of laser technology.
Lidar technology has since expanded vastly in capability, and lidar systems are used to perform a range of measurements that include profiling clouds, measuring winds, studying [[aerosol]]s, and quantifying various atmospheric components. Atmospheric components can in turn provide useful information including [[Atmospheric pressure|surface pressure]] (by measuring the absorption of [[oxygen]] or [[nitrogen]]), [[greenhouse gas]] emissions ([[carbon dioxide]] and [[methane]]), [[photosynthesis]] (carbon dioxide), [[wildfire|fires]] ([[carbon monoxide]]), and [[humidity]] ([[water vapor]]). Atmospheric lidars can be ground-based, airborne, or satellite-based depending on the type of measurement. Atmospheric lidar [[remote sensing]] works in two ways: # by measuring [[backscatter]] from the atmosphere, and # by measuring the scattered reflection off the ground (when the lidar is airborne) or other hard surface. Backscatter from the atmosphere directly gives a measure of clouds and aerosols. Other derived measurements from backscatter such as winds or cirrus ice crystals require careful selection of the wavelength and/or polarization detected. ''Doppler lidar'' and ''Rayleigh Doppler lidar'' are used to measure temperature and wind speed along the beam by measuring the frequency of the backscattered light. The bulk motion of the gas along the beam shifts the frequency of the backscattered light, while thermal motion causes [[Doppler broadening]]; both effects can be exploited to retrieve wind and temperature.<ref>{{cite journal | last1 = Li | year = 2011 | first1 = T.
|title = Middle atmosphere temperature trend and solar cycle revealed by long-term Rayleigh lidar observations |journal = Journal of Geophysical Research| volume = 116 | pages = D00P05 |doi=10.1029/2010jd015275 | bibcode=2011JGRD..116.0P05L| url = https://hal.archives-ouvertes.fr/hal-00594272/file/2010JD015275.pdf | doi-access = free }}</ref> Scanning lidars, such as [[NASA]]'s conical-scanning HARLIE, have been used to measure atmospheric wind velocity.<ref>[http://harlie.gsfc.nasa.gov/IHOP2002/Pub&Pats/AMOS%202002%20final.pdf Thomas D. Wilkerson, Geary K. Schwemmer, and Bruce M. Gentry. ''LIDAR Profiling of Aerosols, Clouds, and Winds by Doppler and Non-Doppler Methods'', NASA International H2O Project (2002)] {{webarchive|url=https://web.archive.org/web/20070822232155/http://harlie.gsfc.nasa.gov/IHOP2002/Pub%26Pats/AMOS%202002%20final.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://harlie.gsfc.nasa.gov/IHOP2002/Pub%26Pats/AMOS%202002%20final.pdf |archive-date=2022-10-09 |url-status=live |date=2007-08-22 }}.</ref> The [[ESA]] wind mission ''ADM-Aeolus'', launched in 2018, was equipped with a Doppler lidar system to provide global measurements of vertical wind profiles.<ref>[http://www.esa.int/esaLP/ESAES62VMOC_LPadmaeolus_0.html 'Earth Explorers: ADM-Aeolus', ''ESA.org'' (European Space Agency, 6 June 2007)]. Retrieved 8 August 2007.</ref> A Doppler lidar system was used at the [[2008 Summer Olympics]] to measure wind fields during the yacht competition.<ref>[http://optics.org/cws/article/research/34878 'Doppler lidar gives Olympic sailors the edge', ''Optics.org'' (3 July 2008)]. Retrieved 8 July 2008.</ref> Doppler lidar systems are now also being applied successfully in the renewable energy sector to acquire wind speed, turbulence, wind veer, and wind shear data. Both pulsed and continuous wave systems are being used.
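The wind retrievals described above rest on the Doppler relation between the radial velocity of the scatterers and the frequency shift of the backscattered light, Δf = 2v/λ. A minimal numeric sketch (the 1.55 μm wavelength is merely a value typical of coherent Doppler wind lidars, not taken from the cited systems):

```python
# Doppler shift of light backscattered from aerosols moving with the wind:
# delta_f = 2 * v_r / wavelength  (the factor 2 accounts for the round trip).
wavelength = 1.55e-6   # m, typical of coherent Doppler wind lidars (assumed)
v_r = 10.0             # m/s, radial wind speed along the beam

delta_f = 2 * v_r / wavelength   # Hz
print(f"{delta_f / 1e6:.1f} MHz")  # about 12.9 MHz
```

Shifts of this magnitude are readily resolved by heterodyne detection, which is why coherent systems can measure winds of a few metres per second.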
Pulsed systems use signal timing to obtain vertical distance resolution, whereas continuous wave systems rely on detector focusing. The term ''eolics'' has been proposed to describe the collaborative and interdisciplinary study of wind using computational fluid mechanics simulations and Doppler lidar measurements.<ref>Clive, P. J. M., [https://web.archive.org/web/20140512215311/http://www.sgurrenergy.com/wp-content/uploads/2014/04/TEDxUniversityOfStrathclyde_TheEmergenceOfEolics.pdf The emergence of eolics], TEDx University of Strathclyde (2014). Retrieved 9 May 2014.</ref> The ground reflection of an airborne lidar gives a measure of surface reflectivity (assuming the atmospheric transmittance is well known) at the lidar wavelength; however, the ground reflection is typically used for making absorption measurements of the atmosphere. "Differential absorption lidar" (DIAL) measurements utilize two or more closely spaced (less than 1 nm) wavelengths to factor out surface reflectivity as well as other transmission losses, since these factors are relatively insensitive to wavelength. When tuned to the appropriate absorption lines of a particular gas, DIAL measurements can be used to determine the concentration (mixing ratio) of that particular gas in the atmosphere. This is referred to as an ''Integrated Path Differential Absorption'' (IPDA) approach, since it is a measure of the integrated absorption along the entire lidar path. IPDA lidars can be either pulsed<ref name=":1">{{cite journal | last1 = Koch | first1 = Grady J.
| last2 = Barnes | first2 = Bruce W | last3 = Petros | first3 = Mulugeta | last4 = Beyon | first4 = Jeffrey Y | last5 = Amzajerdian | first5 = Farzin | last6 = Yu | first6 = Jirong | last7 = Davis | first7 = Richard E | last8 = Ismail | first8 = Syed | last9 = Vay | first9 = Stephanie | last10 = Kavaya | first10 = Michael J | last11 = Singh | first11 = Upendra N | year = 2004| title = Coherent Differential Absorption Lidar Measurements of CO2 | journal = Applied Optics | volume = 43 | issue = 26| pages = 5092–5099 | doi = 10.1364/AO.43.005092 | pmid = 15468711 |bibcode = 2004ApOpt..43.5092K }}</ref><ref name=":2">{{Cite journal|last1=Abshire|first1=James B.|last2=Ramanathan|first2=Anand|last3=Riris|first3=Haris|last4=Mao|first4=Jianping|last5=Allan|first5=Graham R.|last6=Hasselbrack|first6=William E.|last7=Weaver|first7=Clark J.|last8=Browell|first8=Edward V.|date=2013-12-30|title=Airborne Measurements of CO2 Column Concentration and Range Using a Pulsed Direct-Detection IPDA Lidar|journal=Remote Sensing|volume=6|issue=1|pages=443–469|doi=10.3390/rs6010443|bibcode=2013RemS....6..443A|doi-access=free|hdl=2060/20150008257|hdl-access=free}}</ref> or CW<ref name=":3">{{cite journal | last1 = Campbell | first1 = Joel F. | year = 2013| title = Nonlinear swept frequency technique for CO2 measurements using a CW laser system | journal = Applied Optics | volume = 52 | issue = 13| pages = 3100–3107 | doi = 10.1364/AO.52.003100 | pmid = 23669780 |arxiv = 1303.4933 |bibcode = 2013ApOpt..52.3100C | s2cid = 45261286 }}</ref> and typically use two or more wavelengths.<ref>{{cite journal | last1 = Dobler | first1 = Jeremy T. | last2 = Harrison | first2 = F. Wallace | last3 = Browell | first3 = Edward V. 
| last4 = Lin | first4 = Bing | last5 = McGregor | first5 = Doug | last6 = Kooi | first6 = Susan | last7 = Choi | first7 = Yonghoon | last8 = Ismail | first8 = Syed | year = 2013| title = Atmospheric CO2 column measurements with an airborne intensity-modulated continuous wave 1.57 μm fiber laser lidar | journal = Applied Optics | volume = 52 | issue = 12| pages = 2874–2892 | doi = 10.1364/AO.52.002874 | pmid = 23669700 |bibcode = 2013ApOpt..52.2874D | s2cid = 13713360 }}</ref> IPDA lidars have been used for remote sensing of carbon dioxide<ref name=":1"/><ref name=":2" /><ref name=":3"/> and methane.<ref>{{Cite journal|last1=Riris|first1=Haris|last2=Numata|first2=Kenji|last3=Li|first3=Steve|last4=Wu|first4=Stewart|last5=Ramanathan|first5=Anand|last6=Dawsey|first6=Martha|last7=Mao|first7=Jianping|last8=Kawa|first8=Randolph|last9=Abshire|first9=James B.|date=2012-12-01|title=Airborne measurements of atmospheric methane column abundance using a pulsed integrated-path differential absorption lidar|journal=Applied Optics|volume=51|issue=34|doi=10.1364/AO.51.008296|pmid=23207402|issn=1539-4522|bibcode = 2012ApOpt..51.8296R|pages=8296–305|s2cid=207299203}}</ref> ''[[Synthetic array heterodyne detection|Synthetic array]] lidar'' allows imaging lidar without the need for an array detector. It can be used for imaging Doppler velocimetry, ultra-fast frame rate imaging (millions of frames per second), as well as for [[speckle pattern|speckle]] reduction in coherent lidar.<ref name="Strauss">{{cite journal | last1 = Strauss | first1 = C. E. M. | year = 1994 | title = Synthetic-array heterodyne detection: a single-element detector acts as an array | journal = Optics Letters| volume = 19 | issue = 20| pages = 1609–1611 | doi=10.1364/ol.19.001609|bibcode = 1994OptL...19.1609S | pmid=19855597| url = https://zenodo.org/record/1235660 }}</ref> An extensive lidar bibliography for atmospheric and hydrospheric applications is given by Grant.<ref>Grant, W. 
B., Lidar for atmospheric and hydrospheric studies, in ''Tunable Laser Applications'', 1st Edition, [[F. J. Duarte|Duarte, F. J.]] Ed. (Marcel Dekker, New York, 1995) Chapter 7.</ref> ===Law enforcement=== {{see also|Lidar speed gun}} Lidar speed guns are used by the police to measure the speed of vehicles for [[speed limit enforcement]] purposes. Lidar is also used in forensics to aid crime scene investigations. Scans of a scene are taken to record exact details of object placement, blood, and other important information for later review. These scans can also be used to determine bullet trajectory in cases of shootings. ===Military=== Few military applications are known to be in place, and those that are known are classified (such as the lidar-based speed measurement of the [[AGM-129 ACM]] stealth nuclear cruise missile), but a considerable amount of research is underway on the use of lidar for imaging. Higher resolution systems collect enough detail to identify targets, such as [[tank]]s. Examples of military applications of lidar include the Airborne Laser Mine Detection System (ALMDS) for counter-mine warfare by Areté Associates.<ref>{{Cite web|url=http://arete.com/|archive-url=https://web.archive.org/web/20110904075643/http://www.arete.com/index.php?view=stil_mcm|title=Areté|archive-date=September 4, 2011}}</ref> A NATO report (RTO-TR-SET-098) evaluated potential technologies for stand-off detection and discrimination of biological warfare agents. The potential technologies evaluated were Long-Wave Infrared (LWIR), Differential Scattering (DISC), and Ultraviolet Laser Induced Fluorescence (UV-LIF).
The report concluded that: ''Based upon the results of the lidar systems tested and discussed above, the Task Group recommends that the best option for the near-term (2008–2010) application of stand-off detection systems is UV-LIF'',<ref>{{cite web |url=http://www.rta.nato.int/pubs/rdp.asp?RDP=RTO-TR-SET-098 |title=NATO Laser Based Stand-Off Detection of biological Agents |publisher=Rta.nato.int |access-date=2013-05-06 |archive-url=https://web.archive.org/web/20110720141859/http://www.rta.nato.int/pubs/rdp.asp?RDP=RTO-TR-SET-098 |archive-date=2011-07-20 }}</ref> however, in the long term, other techniques such as stand-off [[Raman spectroscopy]] may prove to be useful for identification of biological warfare agents. Short-range compact spectrometric lidar based on Laser-Induced Fluorescence (LIF) would address the presence of bio-threats in aerosol form over critical indoor, semi-enclosed and outdoor venues such as stadiums, subways, and airports. This near real-time capability would enable rapid detection of a bioaerosol release and allow for timely implementation of measures to protect occupants and minimize the extent of contamination.<ref>{{cite web|url=http://www.ino.ca/en-CA/Achievements/Description/project-p/short-range-bioaerosol-threat-detection.html |title=Short-Range Bioaerosol Threat Detection Sensor (SR-BioSpectra) |publisher=Ino.ca |access-date=2013-05-06}}</ref> The Long-Range Biological Standoff Detection System (LR-BSDS) was developed for the U.S. Army to provide the earliest possible standoff warning of a biological attack. It is an airborne system carried by helicopter to detect synthetic aerosol clouds containing biological and chemical agents at long range.
The LR-BSDS, with a detection range of 30 km or more, was fielded in June 1997.<ref>{{cite web|url=http://articles.janes.com/articles/Janes-Nuclear,-Biological-and-Chemical-Defence/LR-BSDS--Long-Range-Biological-Standoff-Detection-System-United-States.html |title=Defense & Security Intelligence & Analysis: IHS Jane's | IHS |website=Articles.janes.com |access-date=2016-02-22}}</ref> Five lidar units produced by the German company [[Sick AG]] were used for short range detection on [[Stanley (vehicle)|Stanley]], the [[driverless car|autonomous car]] that won the 2005 [[DARPA Grand Challenge]]. A robotic [[Boeing AH-6]] performed a fully autonomous flight in June 2010, including avoiding obstacles using lidar.<ref name="cmu">Spice, Byron. [http://www.cmu.edu/news/blog/2010/Summer/unprecedented-robochopper.shtml Researchers Help Develop Full-Size Autonomous Helicopter] {{webarchive|url=https://web.archive.org/web/20110608055751/http://www.cmu.edu/news/blog/2010/Summer/unprecedented-robochopper.shtml |date=2011-06-08 }} ''Carnegie Mellon'', 6 July 2010. Retrieved: 19 July 2010.</ref><ref name="wired">Koski, Olivia. [https://www.wired.com/dangerroom/2010/07/in-a-first-full-sized-robo-copter-flies-with-no-human-help/ In a First, Full-Sized Robo-Copter Flies With No Human Help] ''Wired'', 14 July 2010. 
Retrieved: 19 July 2010.</ref> ===Mining=== The calculation of ore volumes is accomplished by periodic (monthly) scanning in areas of ore removal, then comparing surface data to the previous scan.<ref>{{cite web |url=http://3dlasermapping.com/index.php/mining-monitoring-applications/volume-measuring |title=Volume Measuring |publisher=3dlasermapping.com |access-date=2014-02-17 |archive-url=https://web.archive.org/web/20140404105837/http://3dlasermapping.com/index.php/mining-monitoring-applications/volume-measuring |archive-date=2014-04-04 }}</ref> Lidar sensors may also be used for obstacle detection and avoidance for robotic mining vehicles such as in the Komatsu Autonomous Haulage System (AHS)<ref>[[Modular Mining Systems#Autonomous Haulage Systems]]</ref> used in Rio Tinto's Mine of the Future. ===Physics and astronomy=== A worldwide network of observatories uses [[Lunar Laser Ranging Experiment|lidars to measure the distance to reflectors placed on the Moon]], allowing the position of the Moon to be measured with millimeter precision and [[tests of general relativity]] to be done. [[Mars Orbital Laser Altimeter|MOLA]], the [[Mars]] Orbiter Laser Altimeter, used a lidar instrument on a Mars-orbiting satellite (the NASA [[Mars Global Surveyor]]) to produce a spectacularly precise global topographic survey of the red planet.
Laser altimeters produced global elevation models of Mars, the Moon (Lunar Orbiter Laser Altimeter (LOLA)), Mercury (Mercury Laser Altimeter (MLA)), and the asteroid 433 Eros (NEAR–Shoemaker Laser Rangefinder (NLR)).<ref name="Hargitai 2019 147–174">{{Citation|last1=Hargitai|first1=Henrik|title=Methods in Planetary Topographic Mapping: A Review|date=2019|work=Planetary Cartography and GIS|pages=147–174|editor-last=Hargitai|editor-first=Henrik|series=Lecture Notes in Geoinformation and Cartography|publisher=Springer International Publishing|doi=10.1007/978-3-319-62849-3_6|isbn=978-3-319-62849-3|last2=Willner|first2=Konrad|last3=Buchroithner|first3=Manfred|s2cid=133855780}}</ref> Future missions will also include laser altimeter experiments such as the Ganymede Laser Altimeter (GALA) as part of the Jupiter Icy Moons Explorer (JUICE) mission.<ref name="Hargitai 2019 147–174"/> In September 2008, the NASA [[Phoenix (spacecraft)|''Phoenix'' lander]] used lidar to detect snow in the atmosphere of Mars.<ref>[http://www.nasa.gov/mission_pages/phoenix/news/phoenix-20080929.html NASA. 'NASA Mars Lander Sees Falling Snow, Soil Data Suggest Liquid Past' ''NASA.gov'' (29 September 2008)] {{Webarchive|url=https://web.archive.org/web/20120727085221/http://www.nasa.gov/mission_pages/phoenix/news/phoenix-20080929.html |date=27 July 2012 }}. Retrieved 9 November 2008.</ref> In atmospheric physics, lidar is used as a remote detection instrument to measure densities of certain constituents of the middle and upper atmosphere, such as [[potassium]], [[sodium]], or molecular [[nitrogen]] and [[oxygen]]. These measurements can be used to calculate temperatures.
Lidar can also be used to measure wind speed and to provide information about the vertical distribution of [[aerosol]] particles.<ref>{{cite web|title=Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP)|url=http://www-calipso.larc.nasa.gov/about/payload.php#CALIOP|publisher=NASA|access-date=16 August 2015}}</ref> At the [[Joint European Torus|JET]] [[nuclear fusion]] research facility in the UK near [[Abingdon-on-Thames|Abingdon]], Oxfordshire, lidar [[Thomson scattering]] is used to determine [[electron]] density and temperature profiles of the [[Plasma (physics)|plasma]].<ref>[http://www.jet.efda.org/pages/focus/lidar/index.html CW Gowers. ' Focus On : Lidar-Thomson Scattering Diagnostic on JET' ''JET.EFDA.org'' (undated)]. Retrieved 8 August 2007. {{webarchive|url=https://web.archive.org/web/20070918224524/http://www.jet.efda.org/pages/focus/lidar/index.html|date=September 18, 2007}}</ref> ===Rock mechanics=== Lidar has been widely used in rock mechanics for rock mass characterization and slope change detection. Some important geomechanical properties of the rock mass can be extracted from the 3-D point clouds obtained by means of lidar. Some of these properties are: * Discontinuity orientation<ref>{{cite journal | last1 = Riquelme | first1 = A.J. | last2 = Abellán | first2 = A. | last3 = Tomás | first3 = R. | last4 = Jaboyedoff | first4 = M. | year = 2014 | title = A new approach for semi-automatic rock mass joints recognition from 3D point clouds | url = http://rua.ua.es/dspace/bitstream/10045/36557/1/2014_Riquelme_etal_Computers%26Geosciences.pdf| journal = Computers & Geosciences | volume = 68 | pages = 38–52 | doi = 10.1016/j.cageo.2014.03.014 | bibcode = 2014CG.....68...38R | hdl = 10045/36557 | hdl-access = free }}</ref><ref>{{cite journal | last1 = Gigli | first1 = G. | last2 = Casagli | first2 = N.
| year = 2011 | title = Semi-automatic extraction of rock mass structural data from high resolution LIDAR point clouds | journal = International Journal of Rock Mechanics and Mining Sciences | volume = 48 | issue = 2| pages = 187–198 | doi = 10.1016/j.ijrmms.2010.11.009 | bibcode = 2011IJRMM..48..187G }}</ref><ref name="Slob, S 2010">Slob, S. 2010. Automated rock mass characterization using 3D terrestrial laser scanner, Technical University of Delft.</ref> * Discontinuity spacing and RQD (rock quality designation)<ref name="Slob, S 2010" /><ref>{{cite journal | last1 = Riquelme | first1 = A.J. | last2 = Abellán | first2 = A. | last3 = Tomás | first3 = R. | year = 2015 | title = Discontinuity spacing analysis in rock masses using 3D point clouds | journal = Engineering Geology | volume = 195 | pages = 185–195 | doi = 10.1016/j.enggeo.2015.06.009 | bibcode = 2015EngGe.195..185R | hdl = 10045/47912 | hdl-access = free }}</ref><ref name="Riquelme">{{cite journal | last1 = Sturzenegger | first1 = M. | last2 = Stead | first2 = D.
| year = 2009 | title = Close-range terrestrial digital photogrammetry and terrestrial laser scanning for discontinuity characterization on rock cuts | journal = Engineering Geology | volume = 106 | issue = 3–4| pages = 163–182 | doi = 10.1016/j.enggeo.2009.03.004 | bibcode = 2009EngGe.106..163S }}</ref> * Discontinuity aperture * Discontinuity persistence<ref name="Slob, S 2010" /><ref name="Riquelme" /><ref>{{Cite journal|last1=Riquelme|first1=Adrián|last2=Tomás|first2=Roberto|last3=Cano|first3=Miguel|last4=Pastor|first4=José Luis|last5=Abellán|first5=Antonio|date=2018-05-24|title=Automatic Mapping of Discontinuity Persistence on Rock Masses Using 3D Point Clouds|journal=Rock Mechanics and Rock Engineering|volume=51|issue=10|pages=3005–3028|bibcode=2018RMRE...51.3005R|doi=10.1007/s00603-018-1519-9|s2cid=135109573|issn=0723-2632|url=http://eprints.whiterose.ac.uk/131301/1/RMRE-D-18-00095%20Automated%20Persistence%20Author%20copy.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://eprints.whiterose.ac.uk/131301/1/RMRE-D-18-00095%20Automated%20Persistence%20Author%20copy.pdf |archive-date=2022-10-09 |url-status=live}}</ref> * Discontinuity roughness<ref name="Riquelme" /> * Water infiltration Some of these properties have been used to assess the geomechanical quality of the rock mass through the [[Rock mass rating|RMR]] index. 
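Extracting discontinuity orientation from a point cloud usually reduces to fitting a plane to a patch of points and expressing its normal as dip and dip direction. The sketch below is a simplified stand-in for the cited semi-automatic methods, assuming a small, roughly planar patch (z up, y north):

```python
import numpy as np

def plane_orientation(points: np.ndarray) -> tuple[float, float]:
    """Fit a plane to an (N, 3) array of x/y/z points and return
    (dip, dip_direction) in degrees, as used in rock mass characterization."""
    centered = points - points.mean(axis=0)
    # The plane normal is the singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    nx, ny, nz = vt[-1]
    if nz < 0:                      # force an upward-pointing normal
        nx, ny, nz = -nx, -ny, -nz
    dip = np.degrees(np.arccos(nz))                 # angle from horizontal
    dip_dir = np.degrees(np.arctan2(nx, ny)) % 360  # azimuth of steepest descent
    return float(dip), float(dip_dir)

# Synthetic patch dipping 45 degrees toward the east (dip direction 090).
x, y = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
pts = np.column_stack([x.ravel(), y.ravel(), -x.ravel()])  # plane z = -x
dip, dip_dir = plane_orientation(pts)
```

Real workflows first segment the cloud into planar clusters and handle noise and outliers; this sketch only shows the orientation step.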
Moreover, as the orientations of discontinuities can be extracted using the existing methodologies, it is possible to assess the geomechanical quality of a rock slope through the [[Slope Mass Rating|SMR]] index.<ref>{{Cite journal|last1=Riquelme|first1=Adrián J.|last2=Tomás|first2=Roberto|last3=Abellán|first3=Antonio|date=2016-04-01|title=Characterization of rock slopes through slope mass rating using 3D point clouds|journal=International Journal of Rock Mechanics and Mining Sciences|volume=84|pages=165–176|doi=10.1016/j.ijrmms.2015.12.008|bibcode=2016IJRMM..84..165R |hdl=10045/52313|hdl-access=free}}</ref> In addition to this, the comparison of different 3-D point clouds from a slope acquired at different times allows researchers to study the changes produced on the scene during this time interval as a result of rockfalls or any other landsliding processes.<ref>{{cite journal | last1 = Abellán | first1 = A. | last2 = Oppikofer | first2 = T. | last3 = Jaboyedoff | first3 = M. | last4 = Rosser | first4 = N.J. | last5 = Lim | first5 = M. | last6 = Lato | first6 = M.J. | year = 2014 | title = Terrestrial laser scanning of rock slope instabilities | url = https://zenodo.org/record/3409637| journal = [[Earth Surface Processes and Landforms]] | volume = 39 | issue = 1| pages = 80–97 | doi = 10.1002/esp.3493 | bibcode = 2014ESPL...39...80A | s2cid = 128876331 }}</ref><ref>{{cite journal | last1 = Abellán | first1 = A. | last2 = Vilaplana | first2 = J.M. | last3 = Martínez | first3 = J. | year = 2006 | title = Application of a long-range Terrestrial Laser Scanner to a detailed rockfall study at Vall de Núria (Eastern Pyrenees, Spain) | journal = Engineering Geology | volume = 88 | issue = 3–4| pages = 136–148 | doi = 10.1016/j.enggeo.2006.09.012 | bibcode = 2006EngGe..88..136A }}</ref><ref>{{Cite journal|last1=Tomás|first1=R.|last2=Abellán|first2=A.|last3=Cano|first3=M.|last4=Riquelme|first4=A.|last5=Tenza-Abril|first5=A. 
J.|last6=Baeza-Brotons|first6=F.|last7=Saval|first7=J. M.|last8=Jaboyedoff|first8=M.|date=2017-08-01|title=A multidisciplinary approach for the investigation of a rock spreading on an urban slope|journal=Landslides|volume=15|issue=2|pages=199–217|doi=10.1007/s10346-017-0865-0|issn=1612-510X|doi-access=free|bibcode=2018Lands..15..199T |hdl=10045/73318|hdl-access=free}}</ref> ===THOR=== THOR is a laser designed to measure Earth's atmospheric conditions. The laser penetrates cloud cover<ref>{{Cite web|url=https://airbornescience.nasa.gov/category/type/Lidar|title=Lidar {{!}} NASA Airborne Science Program|website=airbornescience.nasa.gov|access-date=2017-03-20}}</ref> and measures the thickness of the return halo. The sensor has a fiber optic aperture with a width of {{convert|7+1/2|in|cm|0}} that is used to measure the return light. ===Robotics=== Lidar technology is being used in [[robotics]] for the perception of the environment as well as object classification.<ref>{{cite web|url=http://www.iftas.de/Main/Solutions |title=IfTAS |publisher=Iftas.de |access-date=2013-05-06}}</ref> The ability of lidar technology to provide three-dimensional elevation maps of the terrain, high precision distance to the ground, and approach velocity can enable safe landing of robotic and crewed vehicles with a high degree of precision.<ref name="auto"/> Lidar is also widely used in robotics for [[simultaneous localization and mapping]] and is well-integrated into robot simulators.<ref>{{cite web|url=https://www.cyberbotics.com/doc/guide/lidar-sensors |title=Lidar simulation models in Webots|access-date=2018-06-04}}</ref> Refer to the Military section above for further examples. ===Spaceflight=== Lidar is increasingly being utilized for [[Situation awareness|rangefinding]] and [[orbital element]] calculation of [[relative velocity]] in [[proximity operations]] and [[orbital stationkeeping|stationkeeping]] of [[spacecraft]].
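For proximity operations, relative velocity can be estimated by differencing successive lidar range measurements. A deliberately simple sketch (real systems filter noisy ranges rather than difference them directly; the numbers are illustrative):

```python
def closing_speed(ranges_m, dt_s):
    """Estimate range rate (m/s) from equally spaced lidar range samples.
    Negative values mean the target is approaching."""
    return [(r1 - r0) / dt_s for r0, r1 in zip(ranges_m, ranges_m[1:])]

# A target drifting closer by 0.5 m per 1 s sample: range rate of -0.5 m/s.
rates = closing_speed([100.0, 99.5, 99.0, 98.5], dt_s=1.0)
```

In practice a Kalman filter over many samples would smooth out ranging noise before the rate is used for stationkeeping decisions.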
Lidar has also been used for [[atmosphere|atmospheric]] studies from space. Short pulses of laser light beamed from a spacecraft can reflect off tiny particles in the atmosphere and back to a telescope aligned with the spacecraft laser. By precisely timing the lidar echo, and by measuring how much laser light is received by the telescope, scientists can accurately determine the location, distribution and nature of the particles. The result is a revolutionary new tool for studying constituents in the atmosphere, from cloud droplets to industrial pollutants, which are difficult to detect by other means.<ref>{{cite web|url=http://www.nasa.gov/centers/langley/news/factsheets/LITE.html |title=NASA – Lidar In-space Technology Experiment (LITE) |publisher=Nasa.gov |date=2011-08-25 |access-date=2013-05-06}}</ref><ref>{{Cite journal|author1=D.M. Winker |author2=R.H. Couch |author3=M.P. McCormick |title= An overview of LITE: NASA's Lidar In-space Technology Experiment|journal=Proceedings of the IEEE |volume=84 |issue=2 |pages=164–180 |doi=10.1109/5.482227 |date=2011-09-27 }}</ref> Laser altimetry is used to make [[Digital elevation model|digital elevation map]]s of planets, including the [[Mars Orbital Laser Altimeter]] (MOLA) mapping of Mars,<ref>Bruce Banerdt, [https://mars.nasa.gov/MPF/martianchronicle/martianchron3/marschro35.html Orbital Laser Altimeter], ''The Martian Chronicle, Volume 1'', No. 3, NASA.gov. Retrieved 11 March 2019.</ref> the [[Lunar Orbital Laser Altimeter]] (LOLA)<ref>NASA, [https://lola.gsfc.nasa.gov LOLA]. Retrieved 11 March 2019.</ref> and Lunar Altimeter (LALT) mapping of the Moon, and the Mercury Laser Altimeter (MLA) mapping of Mercury.<ref>John F. Cavanaugh, ''et al.,'' "[http://www-geodyn.mit.edu/cavanaugh.mla.ssr07.pdf The Mercury Laser Altimeter Instrument for the MESSENGER Mission]", ''Space Sci Rev'', {{doi|10.1007/s11214-007-9273-4}}, 24 August 2007. 
Retrieved 11 March 2019.</ref> It is also used to help navigate the [[Ingenuity (helicopter)|helicopter ''Ingenuity'']] in its record-setting flights over the terrain of [[Mars]].<ref name="ieee-2021" /> ===Surveying=== [[File:Mapping van tomtom with five lidars.jpg|thumb|This [[TomTom]] mapping van is fitted with five lidar sensors on its roof rack.]] Airborne lidar sensors are used by companies in the remote sensing field. They can be used to create a DTM (Digital Terrain Model) or DEM ([[Digital Elevation Model]]); this is quite a common practice for larger areas as a plane can acquire {{cvt|3|–|4|km|mi|frac=2}} wide swaths in a single flyover. Greater vertical accuracy of below {{cvt|50|mm|in|0}} can be achieved with a lower flyover, even in forests, where it is able to give the height of the canopy as well as the ground elevation. Typically, a GNSS receiver configured over a georeferenced control point is needed to tie the data to the [[World Geodetic System]] (WGS).<ref>{{Cite journal|title = Maritime Laser Scanning as the Source for Spatial Data|journal = Polish Maritime Research|volume = 22|issue = 4|pages = 9–14|doi = 10.1515/pomr-2015-0064|first1 = Jakub|last1 = Szulwic|first2 = Paweł|last2 = Burdziakowski|first3 = Artur|last3 = Janowski|first4 = Marek|last4 = Przyborski|first5 = Paweł|last5 = Tysiąc|first6 = Aleksander|last6 = Wojtowicz|first7 = Arthem|last7 = Kholodkov|first8 = Krzysztof|last8 = Matysik|first9 = Maciej|last9 = Matysik|year = 2015| bibcode=2015PMRes..22d...9S |doi-access = free}}</ref> Lidar is also in use in [[hydrographic survey]]ing.
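In bathymetric lidar, the depth comes from the extra delay between the water-surface return and the bottom return, with light slowed by the refractive index of water. A simplified sketch that ignores beam geometry and refraction angle (values illustrative):

```python
C = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33      # approximate refractive index of water

def water_depth(dt_surface_to_bottom_s: float) -> float:
    """Depth (m) from the extra round-trip delay of the bottom return,
    using the reduced speed of light in water (c / n)."""
    return (C / N_WATER) * dt_surface_to_bottom_s / 2.0

# A bottom return arriving ~89 ns after the surface return: ~10 m of water.
depth_m = water_depth(88.7e-9)
```

Real systems also correct for the off-nadir angle of the beam as it refracts at the water surface.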
Depending upon the clarity of the water, lidar can measure depths from {{cvt|0.9|to|40|m|ft|0}} with a vertical accuracy of {{cvt|15|cm|in|0}} and horizontal accuracy of {{cvt|2.5|m|ft|0}}.<ref>{{Cite web|url=http://home.iitk.ac.in/~blohani/LiDAR_Tutorial/Bathymetric%20LiDAR.htm|title=Bathymetric LiDAR|website=home.iitk.ac.in|access-date=2018-01-17}}</ref> ===Transport=== [[File:Ouster_OS1-64_lidar_point_cloud_of_intersection_of_Folsom_and_Dore_St,_San_Francisco.png|thumb|A point cloud generated from a moving car using a single [[Ouster (company)|Ouster]] OS1 lidar]] Lidar has been used in the railroad industry to generate asset health reports for asset management and by departments of transportation to assess their road conditions. CivilMaps.com is one company working in this field.<ref>{{cite web |url=http://civilmaps.com/accelerating-road-and-pavement-condition-surveys-processing-implications/ |title=CivilMaps.com accelerating road and pavement condition surveys |publisher=Civil Maps |date=2015-03-15 |access-date=2015-03-15 |archive-url=https://web.archive.org/web/20150402111907/http://civilmaps.com/accelerating-road-and-pavement-condition-surveys-processing-implications/ |archive-date=2015-04-02 }}</ref> Lidar has been used in [[Adaptive Cruise Control|adaptive cruise control]] (ACC) systems for automobiles. Systems such as those by Siemens, Hella, Ouster and Cepton use a lidar device mounted at the front of the vehicle, such as on the bumper, to monitor the distance between the vehicle and any vehicle in front of it.<ref>{{cite web|url=https://www.sciencedaily.com/releases/2007/02/070218131830.htm |title=Bumper-mounted lasers |publisher=Sciencedaily.com |date=2007-02-27 |access-date=2013-05-06}}</ref> If the vehicle in front slows down or is too close, the ACC applies the brakes to slow the vehicle. When the road ahead is clear, the ACC allows the vehicle to accelerate to a speed preset by the driver.
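The ACC behaviour described above amounts to a simple rule on the lidar-measured gap. A toy sketch (the time-gap threshold and function are illustrative, not from any production system):

```python
def acc_action(gap_m: float, speed_mps: float, set_speed_mps: float,
               min_time_gap_s: float = 2.0) -> str:
    """Decide a cruise-control action from the lidar gap to the lead vehicle.
    Brake when the time gap falls below the threshold; otherwise work back
    toward the driver's preset speed."""
    if speed_mps > 0 and gap_m / speed_mps < min_time_gap_s:
        return "brake"
    if speed_mps < set_speed_mps:
        return "accelerate"
    return "hold"

# A 40 m gap at 25 m/s is a 1.6 s time gap, below the 2 s threshold.
action = acc_action(gap_m=40.0, speed_mps=25.0, set_speed_mps=30.0)
```

Production controllers modulate braking and acceleration continuously rather than switching between discrete actions, but the gap-to-time conversion is the same.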
Refer to the Military section above for further examples. A lidar-based device, the [[Ceilometer]], is used at airports worldwide to measure the height of clouds on runway approach paths.<ref>{{Cite web|url=https://www.weather.gov/media/asos/ASOS%20Implementation/CL31%20Implementation%20Plan%20_FINAL_%2011-14-2008.pdf |archive-url=https://ghostarchive.org/archive/20221009/https://www.weather.gov/media/asos/ASOS%20Implementation/CL31%20Implementation%20Plan%20_FINAL_%2011-14-2008.pdf |archive-date=2022-10-09 |url-status=live|title=Automated Surface Observing System (ASOS) Implementation Plan|last=U.S. Department of Commerce|date=14 November 2008|website=weather.gov}}</ref> ===Wind farm optimization=== Lidar can be used to increase the energy output from [[wind farms]] by accurately measuring wind speeds and wind turbulence.<ref>Clive, P. J. M., [http://environmentalresearchweb.org/cws/article/opinion/36322 Windpower 2.0: technology rises to the challenge] {{Webarchive|url=https://web.archive.org/web/20140513010910/http://environmentalresearchweb.org/cws/article/opinion/36322 |date=2014-05-13 }} Environmental Research Web, 2008. 
Retrieved: 9 May 2014.</ref><ref name="RisoeHorns12">{{cite web|author=Mikkelsen, Torben|url=http://130.226.56.153/rispubl/reports/ris-r-1506.pdf|title=12MW Horns Rev Experiment|publisher=Risoe|date=October 2007|access-date=2010-04-25|display-authors=etal|archive-url=https://web.archive.org/web/20110703035643/http://130.226.56.153/rispubl/reports/ris-r-1506.pdf|archive-date=2011-07-03}}</ref> Experimental lidar systems<ref name="EconomistLidar">{{cite news|url=http://www.economist.com/science-technology/technology-quarterly/displaystory.cfm?story_id=15582251|title=Smarting from the wind|newspaper=The Economist|date=2010-03-04|access-date=2010-04-25}}</ref><ref name="University of Stuttgart">{{cite news|url=http://www.uni-stuttgart.de/hkom/presseservice/uni-infos/2012/wind.html|title=The world's first control of a wind turbine with a nacelle-based Lidar system|work=Corporate Communications University of Stuttgart|date=2012-06-05|access-date=2014-04-12}}</ref> can be mounted on the [[nacelle]]<ref name="NRELCART3">Andrew K. Scholbrock et al. [http://www.nrel.gov/docs/fy13osti/57339.pdf Field Testing LIDAR Based Feed-Forward Controls on the NREL Controls Advanced Research Turbine] ''National Renewable Energy Laboratory Data Base'', 12 April 2014. Retrieved: 12 April 2014.</ref> of a [[wind turbine]] or integrated into the rotating spinner<ref name="RisoeEWEC">Mikkelsen, Torben & Hansen, Kasper Hjorth et al. [http://orbit.dtu.dk/getResource?recordId=259451&objectId=1&versionId=1 Lidar wind speed measurements from a rotating spinner] ''Danish Research Database & Danish Technical University'', 20 April 2010. Retrieved: 25 April 2010.</ref> to measure oncoming horizontal winds,<ref>Asimakopolous, M., Clive, P. J. 
M., More, G., and Boddington, R., [http://www.sgurrenergy.com/wp-content/uploads/2012/12/Compression-zone-technical-paper-A4.pdf Offshore compression zone measurement and visualisation] {{webarchive|url=https://web.archive.org/web/20140512231016/http://www.sgurrenergy.com/wp-content/uploads/2012/12/Compression-zone-technical-paper-A4.pdf |date=2014-05-12 }} European Wind Energy Association Annual Conference, 2014. Retrieved: 9 May 2014.</ref> winds in the wake of the wind turbine,<ref>Gallacher, D., and More, G., [http://www.sgurrenergy.com/wp-content/uploads/2012/12/EWEA-2014-Poster-ID-175-Lidar-Measured-Wakes-B3.pdf Lidar measurements and visualisation of turbulence and wake decay length] {{webarchive|url=https://web.archive.org/web/20140512223921/http://www.sgurrenergy.com/wp-content/uploads/2012/12/EWEA-2014-Poster-ID-175-Lidar-Measured-Wakes-B3.pdf |date=2014-05-12 }} European Wind Energy Association Annual Conference, 2014. Retrieved: 9 May 2014.</ref> and proactively adjust blades to protect components and increase power. Lidar is also used to characterise the incident wind resource for comparison with wind turbine power production to verify the performance of the wind turbine<ref>Clive, P. J. M., et al., [http://www.sgurrenergy.com/wp-content/uploads/2012/12/Power-Curve-Technical-Paper-A3.pdf Offshore power curve tests for onshore costs: a real world case study] {{webarchive|url=https://web.archive.org/web/20140512225420/http://www.sgurrenergy.com/wp-content/uploads/2012/12/Power-Curve-Technical-Paper-A3.pdf |date=2014-05-12 }} European Wind Energy Association Annual Conference, 2014. Retrieved: 9 May 2014.</ref> by measuring the wind turbine's power curve.<ref>Clive, P. J. 
M., [http://www.sgurrenergy.com/wp-content/uploads/2013/07/Offshore-Power-Performance-Assessment-using-lidar1.pdf Offshore power performance assessment for onshore costs] {{webarchive|url=https://web.archive.org/web/20140417103501/http://www.sgurrenergy.com/wp-content/uploads/2013/07/Offshore-Power-Performance-Assessment-using-lidar1.pdf |date=2014-04-17 }} DEWEK (Deutsche Windenergie Konferenz), 2012. Retrieved: 9 May 2014.</ref> Wind farm optimization can be considered a topic in [[Differential absorption LIDAR|''applied eolics'']]. Another aspect of lidar in wind related industry is to use [[computational fluid dynamics]] over lidar-scanned surfaces in order to assess the wind potential,<ref>{{cite journal|last=Lukač|first=Niko|author2=Štumberger Gorazd |author3=Žalik Borut |title=Wind resource assessment using airborne LiDAR data and smoothed particle hydrodynamics|journal=Environmental Modelling & Software|year=2017|volume=95|pages=1–12|doi=10.1016/j.envsoft.2017.05.006|bibcode=2017EnvMS..95....1L }}</ref> which can be used for optimal wind farm placement. === Solar photovoltaic deployment optimization === Lidar can also be used to assist planners and developers in optimizing solar [[photovoltaic]] systems at the city level by determining appropriate roof tops <ref>{{cite journal|last=Jochem|first=Andreas|author2=Höfle Bernhard |author3=Rutzinger Martin |author4=Pfeifer Norbert |title=Automatic roof plane detection and analysis in airborne lidar point clouds for solar potential assessment|journal=Sensors|year=2009|volume=9|issue=7|pages=5241–5262|doi=10.3390/s90705241|pmid=22346695|pmc=3274168|bibcode=2009Senso...9.5241J|doi-access=free}}</ref><ref>{{cite journal | last1 = Nguyen | first1 = Ha T. | last2 = Pearce | first2 = Joshua M. 
| last3 = Harrap | first3 = Rob | last4 = Barber | first4 = Gerald | year = 2012 | title = The Application of LiDAR to Assessment of Rooftop Solar Photovoltaic Deployment Potential on a Municipal District Unit | journal = Sensors | volume = 12 | issue = 4| pages = 4534–4558 | doi=10.3390/s120404534| pmid = 22666044 | pmc = 3355426 | bibcode = 2012Senso..12.4534N | doi-access = free }}</ref> and for determining [[shading]] losses.<ref>{{cite journal|last=Nguyen|first=Ha T.|author2=Pearce, Joshua M. |title=Incorporating shading losses in solar photovoltaic potential assessment at the municipal scale|journal=Solar Energy|year=2012|volume=86|issue=5|pages=1245–1260|doi=10.1016/j.solener.2012.01.017|url=http://hal.archives-ouvertes.fr/hal-00685775|bibcode = 2012SoEn...86.1245N |s2cid=15435496}}</ref> Recent airborne laser scanning efforts have focused on ways to estimate the amount of solar light hitting vertical building facades,<ref>{{cite journal|last=Jochem|first=Andreas|author2=Höfle Bernhard |author3=Rutzinger Martin |title=Extraction of vertical walls from mobile laser scanning data for solar potential assessment|journal=Remote Sensing|year=2011|volume=3|issue=4|pages=650–667|doi=10.3390/rs3030650|bibcode = 2011RemS....3..650J |doi-access=free}}</ref> or by incorporating more detailed shading losses by considering the influence from vegetation and larger surrounding terrain.<ref>{{cite journal|last=Lukač|first=Niko|author2=Žlaus Danijel |author3=Seme Seme |author4=Žalik Borut |author5=Štumberger Gorazd |title=Rating of roofs' surfaces regarding their solar potential and suitability for PV systems, based on LiDAR data|journal=Applied Energy|year=2013|volume=102|pages=803–812|doi=10.1016/j.apenergy.2012.08.042|bibcode=2013ApEn..102..803L }}</ref> ===Video games=== Recent simulation racing games such as ''[[rFactor Pro]]'', ''[[iRacing]]'', ''[[Assetto Corsa]]'' and ''[[Project CARS]]'' increasingly feature race tracks reproduced from 3-D point clouds acquired 
through lidar surveys, resulting in surfaces replicated with centimeter or millimeter precision in the in-game 3-D environment.<ref>{{Cite web|date=2011-07-15|title=rFactor Pro - Laser Scanning Heaven|url=https://www.virtualr.net/rfactor-pro-laser-scanning-heaven|access-date=2020-06-04|website=VirtualR.net - 100% Independent Sim Racing News}}</ref><ref>{{Cite web|last=Marsh|first=William|date=2017-06-30|title=rFactor Pro Shows off LIDAR Scanned Hungaroring|url=https://simracingpaddock.com/rfactor2/rfactor-pro-shows-off-lidar-scanned-hungaroring/|access-date=2020-06-04|website=Sim Racing Paddock}}</ref><ref>{{Cite web|date=2017-02-08|title=New Tracks in Project CARS 2 Were Laser-Scanned With Drones|url=https://www.gtplanet.net/new-tracks-in-project-cars-2-were-laser-scanned-with-drones/|access-date=2020-06-04|website=GTPlanet}}</ref> The 2017 exploration game ''[[Scanner Sombre]]'', by [[Introversion Software]], uses lidar as a fundamental game mechanic. In ''[[Build the Earth]]'', lidar is used to create accurate renders of terrain in ''[[Minecraft]]'' to account for any errors (mainly regarding elevation) in the default generation. The process of rendering terrain into Build the Earth is limited by the amount of data available in a region, as well as the speed at which the files can be converted into block data. ===Other uses=== [[File:LiDAR_Scanner_and_Back_Camera_of_iPad_Pro_2020_-_4.jpg|thumb|Lidar scanner on a [[iPad Pro (4th generation)|4th generation iPad Pro]]]] The video for the 2007 song "[[House of Cards (Radiohead song)|House of Cards]]" by [[Radiohead]] was believed to be the first use of real-time 3-D laser scanning to record a music video.
The range data in the video is not entirely from lidar, as structured light scanning is also used.<ref>{{cite web|url=http://creativity-online.com/?action=news:article&newsId=129514&sectionId=behind_the_work|author=Nick Parish|title=From OK Computer to Roll computer: Radiohead and director James Frost make a video without cameras|publisher=Creativity|date=2008-07-13|archive-url=https://web.archive.org/web/20080717184531/http://creativity-online.com/?action=news:article&newsId=129514&sectionId=behind_the_work|archive-date=2008-07-17}}</ref> In 2020, [[Apple Inc.|Apple]] introduced the [[iPad Pro (4th generation)|fourth generation of iPad Pro]] with a lidar sensor integrated into the rear [[camera module]], developed especially for [[augmented reality]] (AR) experiences.<ref>{{cite web |title=LiDAR vs. 3D ToF Sensors – How Apple Is Making AR Better for Smartphones |date=31 March 2020 |url=https://ios.gadgethacks.com/news/lidar-vs-3d-tof-sensors-apple-is-making-ar-better-for-smartphones-0280778/ |access-date=2020-04-03}}</ref> The feature was later included in the [[iPhone 12 Pro|iPhone 12 Pro lineup]] and subsequent Pro models.<ref>{{cite web |title= Apple launches iPhone 12 Pro line with new design, better cameras, LiDAR | date=13 October 2020| url=https://appleinsider.com/articles/20/10/13/apple-launches-iphone-12-pro-line-with-new-design-better-cameras/|access-date=2020-10-14}}</ref> On Apple devices, lidar enables portrait-mode pictures with night mode, speeds up [[Autofocus|autofocus]], and improves accuracy in the [[Pre-installed iOS apps#Measure|Measure]] app. In 2022, ''[[Wheel of Fortune (American game show)|Wheel of Fortune]]'' started using lidar technology to track when [[Vanna White]] moves her hand over the puzzle board to reveal letters.
The first episode to have this technology was in the season 40 premiere.<ref>{{Cite web|url=https://www.tvinsider.com/1060488/wheel-of-fortune-new-puzzle-board-sparks-mixed-reaction-from-fans/|title='Wheel of Fortune' New Puzzle Board Sparks Mixed Reaction From Fans|work=TV Insider|first=Martin|last=Holmes|date=13 September 2022 }}</ref> == Variants == In flash lidar, the entire field of view is illuminated with a wide [[beam divergence|diverging]] laser beam in a single pulse. This is in contrast to conventional scanning lidar, which uses a [[collimated beam|collimated laser beam]] that illuminates a single point at a time, and the beam is [[raster scan]]ned to illuminate the field of view point-by-point. This illumination method requires a different detection scheme as well. In both scanning and flash lidar, a [[time-of-flight camera]] is used to collect information about both the 3-D location and intensity of the light incident on it in every frame. However, in scanning lidar, this camera contains only a point sensor, while in flash lidar, the camera contains either a 1-D or a 2-D [[sensor array]], each pixel of which collects 3-D location and intensity information. In both cases, the depth information is collected using the [[time of flight]] of the laser pulse (i.e., the time it takes each laser pulse to hit the target and return to the sensor), which requires the pulsing of the laser and acquisition by the camera to be synchronized.<ref name=":12">{{Cite web|title=Advanced Scientific Concepts Inc|url=http://www.advancedscientificconcepts.com/technology/technology.html|access-date=2019-03-08|website=advancedscientificconcepts.com}}</ref> The result is a camera that takes pictures of distance, instead of colors.<ref name=":11" /> Flash lidar is especially advantageous, when compared to scanning lidar, when the camera, scene, or both are moving, since the entire scene is illuminated at the same time. 
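Because each pixel of a flash lidar's sensor array records its own pulse round-trip time, converting a frame into a distance image is a single elementwise operation. A simplified sketch (array shape and delay values are illustrative):

```python
import numpy as np

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_image(round_trip_times_s: np.ndarray) -> np.ndarray:
    """Convert a 2-D array of per-pixel pulse round-trip times (s) into a
    2-D array of distances (m), as a flash lidar's sensor array does."""
    return C * round_trip_times_s / 2.0

# A 2x2 detector: the top row sees targets at ~15 m, the bottom row at ~30 m.
times = np.array([[1.0e-7, 1.0e-7],
                  [2.0e-7, 2.0e-7]])
depths = distance_image(times)
```

This is the sense in which the camera "takes pictures of distance, instead of colors": the whole frame of timings becomes a frame of ranges at once.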
With scanning lidar, motion can cause "jitter" from the lapse in time as the laser rasters over the scene. As with all forms of lidar, the onboard source of illumination makes flash lidar an active sensor. The signal that is returned is processed by embedded algorithms to produce a nearly instantaneous 3-D rendering of objects and terrain features within the field of view of the sensor.<ref>{{Cite web|title=Patent Details|url=https://technology.nasa.gov/patent/LAR-TOPS-168|access-date=2019-03-08|website=technology.nasa.gov}}</ref> The laser pulse repetition frequency is sufficient for generating 3-D videos with high resolution and accuracy.<ref name=":12" /><ref>{{Cite web|title=Analog to Digital Conversion: Sampling|url=https://www.cl.cam.ac.uk/~jac22/books/mm/book/node96.html|access-date=2019-03-08 |website=cl.cam.ac.uk}}</ref> The high frame rate of the sensor makes it a useful tool for a variety of applications that benefit from real-time visualization, such as highly precise remote landing operations.<ref>{{Cite conference |date=2019-05-07|title=Imaging Flash Lidar for Autonomous Safe Landing and Spacecraft Proximity Operation |conference=AIAA Space 2016 Conference |website=NASA Technical Reports Server |url=https://ntrs.nasa.gov/citations/20160011575}}</ref> By immediately returning a 3-D elevation mesh of target landscapes, a flash sensor can be used to identify optimal landing zones in autonomous spacecraft landing scenarios.<ref>Dietrich, Ann Brown, "Supporting Autonomous Navigation with Flash Lidar Images in Proximity to Small Celestial Bodies" (2017). ''CU Boulder'' ''Aerospace Engineering Sciences Graduate Theses & Dissertations''. 178.</ref> Seeing at a distance requires a powerful burst of light, but the power must be limited to levels that do not damage human retinas, which restricts operation to eye-safe wavelengths. However, low-cost silicon imagers cannot detect light in the eye-safe spectrum.
Instead, [[Gallium arsenide|gallium-arsenide]] imagers are required, which can boost costs to $200,000.<ref name=":11" /> Gallium-arsenide is the same compound used to produce high-cost, high-efficiency solar panels usually used in space applications. ==Alternative technologies== {{further|3-D scanner}} [[Computer stereo vision]] has shown promise as an alternative to lidar for close range applications.<ref>{{cite arXiv|last1=Wang|first1=Yan|last2=Chao|first2=Wei-Lun|last3=Garg|first3=Divyansh|last4=Hariharan|first4=Bharath|last5=Campbell|first5=Mark|last6=Weinberger|first6=Kilian Q.|date=2020-02-22|title=Pseudo-LiDAR from Visual Depth Estimation: Bridging the Gap in 3D Object Detection for Autonomous Driving|class=cs.CV|eprint=1812.07179}}</ref> ==See also== *{{annotated link|Geological structure measurement by LiDAR}} *{{annotated link|Lidar detector}} *{{annotated link|Photogrammetry}} *{{annotated link|Range imaging}} *{{annotated link|Time-domain reflectometry}} ==References== {{Reflist}} ==Further reading== * {{cite journal |doi=10.3390/s130100516|doi-access=free |title=Use of a Terrestrial LIDAR Sensor for Drift Detection in Vineyard Spraying |date=2013 |last1=Gil |first1=Emilio |last2=Llorens |first2=Jordi |last3=Llop |first3=Jordi |last4=Fàbregas |first4=Xavier |last5=Gallart |first5=Montserrat |journal=Sensors |volume=13 |issue=1 |pages=516–534 |pmid=23282583 |pmc=3574688 |bibcode=2013Senso..13..516G }} *Heritage, E. (2011). 3D laser scanning for heritage. Advice and guidance to users on laser scanning in archaeology and architecture. Available at www.english-heritage.org.uk. [https://historicengland.org.uk/images-books/publications/3d-laser-scanning-heritage2/ 3D Laser Scanning for Heritage {{!}} Historic England]. * Heritage, G., & Large, A. (Eds.). (2009). Laser scanning for the environmental sciences. John Wiley & Sons. {{ISBN|1-4051-5717-8}}. * Maltamo, M., Næsset, E., & Vauhkonen, J. (2014). 
Forestry Applications of Airborne Laser Scanning: Concepts and Case Studies (Vol. 27). Springer Science & Business Media. {{ISBN|94-017-8662-3}}. * Shan, J., & Toth, C. K. (Eds.). (2008). Topographic laser ranging and scanning: principles and processing. CRC press. {{ISBN|1-4200-5142-3}}. * Vosselman, G., & Maas, H. G. (Eds.). (2010). Airborne and terrestrial laser scanning. Whittles Publishing. {{ISBN|1-4398-2798-2}}. ==External links== {{Commons category|LIDAR}} *{{cite web |author=National Oceanic and Atmospheric Administration (NOAA)| title=What is LIDAR? | website=NOAA's National Ocean Service | date=15 April 2020 | url=https://oceanservice.noaa.gov/facts/lidar.html }} *[https://web.archive.org/web/20160219045753/http://lidar.cr.usgs.gov/ The USGS Center for LIDAR Information Coordination and Knowledge (CLICK)] – A website intended to "facilitate data access, user coordination and education of lidar remote sensing for scientific needs." {{Meteorological equipment}} {{Authority control}} [[Category:Lidar| ]] [[Category:Meteorological instrumentation and equipment]] [[Category:Robotic sensing]] [[Category:Articles containing video clips]]