Autonomous vehicles have long been viewed as the next monumental breakthrough in engineering. Depicted throughout Hollywood and analyzed in many journals, they are among the most highly scrutinized potential breakthroughs of this decade. There are six levels of autonomy (levels 0 through 5), representing a progressive pathway to level 5 – full autonomy. This raises several questions: How long until we reach complete autonomy? What level of autonomy are we at now? And how is it accomplished? First, let’s get a clear picture of what each level of autonomy entails.
Light detection and ranging (LiDAR) sensors are a commonly used source of optical information for remote sensing payloads in scanning and surveying applications. A LiDAR payload emits pulses of laser light and measures their reflections to determine the relative distance to the point each pulse reflected from. When these pulses backscatter (reflect back at an angle of 180 degrees), many payloads use an inertial navigation system (INS) to timestamp and georeference the acquired data points. Compiled together and paired with point cloud software, these individual data points streamline the analysis of structures and ground planes.
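The ranging step described above can be sketched in a few lines: since a pulse travels out to the target and back, the one-way distance is the round-trip time multiplied by the speed of light, divided by two. This is a minimal illustration, not any particular payload's firmware; the function name and example timing are assumptions.

```python
# Minimal sketch of LiDAR time-of-flight ranging.
# The pulse travels out and back, so range = c * t / 2.

C = 299_792_458.0  # speed of light in a vacuum, m/s


def pulse_range(round_trip_s: float) -> float:
    """Distance to the reflecting point given the pulse's
    round-trip travel time in seconds."""
    return C * round_trip_s / 2.0


# A return arriving ~667 ns after emission corresponds to roughly 100 m.
print(round(pulse_range(667e-9), 1))  # → 100.0
```

In a real payload, each such range would then be combined with the INS timestamp and attitude to georeference the point before it joins the point cloud.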
Sensor fusion plays a large role in any device that is attempting to produce estimated, quantifiable data. Sensor fusion is the ability to bring together inputs from multiple sensors to produce a single model whose result is more accurate than that of the individual inputs alone. There are three fundamental methods of sensor fusion:
- Redundant Sensors – All sensors provide the same information about the environment.
- Complementary Sensors – Each sensor provides independent, disjoint information about the environment.
- Coordinated Sensors – The sensors collect information about the environment sequentially.
From there, the information is communicated in one of three ways. In a centralized setup, all sensors provide information to a common central node. In a decentralized configuration, no information is communicated between the sensors and the nodes. In a distributed organization, the nodes exchange sensor information at a given rate.
Evolution of Remote Sensing
Remote sensing emerged more than 150 years ago, when balloonists took pictures of the ground using the recently invented photographic cameras of the 1840s. Perhaps the most memorable breakthrough in the field at the end of the 19th century was the famous fleet of pigeons in Europe, which took pictures with cameras attached to their bodies. By the First World War, cameras mounted on airplanes provided aerial views of vast surface areas that proved invaluable in military reconnaissance. The aerial photograph remained the standard tool for imaging the surface until the early 1960s.