
Vista LiDAR Sensor Unveiled for Autonomous Vehicles

April 26th, 2018 by Susan Smith

Cepton Technologies, Inc., a provider of 3D LiDAR solutions for automotive, industrial and mapping applications, recently introduced its Vista LiDAR sensor at the annual NVIDIA GPU Technology Conference, making it immediately available for the autonomous vehicle market.

[Image: Vista on the car]

The 120-line-equivalent scanner delivers 200 meters of range at 0.2 degrees of spatial resolution. With a much smaller footprint than most solutions on the market, the Vista LiDAR also draws less than 10 watts of power, allowing automakers to seamlessly integrate LiDAR technology into the vehicle body.
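
To put those numbers in perspective, a quick back-of-the-envelope calculation (mine, not Cepton's) converts the 0.2-degree angular resolution into the spacing between adjacent laser returns at a given range:

```python
import math

# Published Vista specs quoted above; everything derived here is a
# back-of-the-envelope estimate, not a Cepton-published figure.
MAX_RANGE_M = 200.0      # maximum range, meters
ANGULAR_RES_DEG = 0.2    # angular (spatial) resolution, degrees

def point_spacing(range_m: float, res_deg: float = ANGULAR_RES_DEG) -> float:
    """Approximate spacing between adjacent laser returns at a given range."""
    return range_m * math.tan(math.radians(res_deg))

for r in (50.0, 100.0, MAX_RANGE_M):
    print(f"At {r:5.0f} m, adjacent returns are ~{point_spacing(r):.2f} m apart")
# Even at the full 200 m range, the ~0.70 m spacing still puts
# multiple returns on a car-sized target.
```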

The Vista sensor is the fourth LiDAR product developed by the Cepton team over its first 20 months in operation. Built on Cepton’s patented micro-motion technology (MMT) platform, Vista has no rotational or frictional parts and consists only of high-maturity automotive components for expedited automotive-grade certification. With no moving parts to wear, the sensor is less likely to wear out or break down.

In the following interview with Wei Wei, Senior Business Development at Cepton, GISCafe Voice finds out about the inner workings of this exciting new technology for autonomous vehicles.

A video overview is available here: https://vimeo.com/261965022

Are you using proprietary GIS or other tools in building the product?

Cepton is not an expert in the GIS realm. The general definition of GIS is a framework for gathering, managing and analyzing data. Cepton makes LiDAR systems that serve as the data-gathering component of a mapping system used to produce digital maps. Cepton’s customers, e.g., LiDAR USA, integrate our LiDAR into their UAV mapping systems, giving users the ability to collect data and post-process it into 3D maps.
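
As a rough illustration of that data-gathering role, the sketch below converts raw range-and-angle LiDAR returns into Cartesian points and then shifts them into a map frame using the platform pose. The function names and pose values are hypothetical, not Cepton's or LiDAR USA's actual API.

```python
import numpy as np

def returns_to_points(ranges_m, azimuth_deg, elevation_deg):
    """Spherical LiDAR returns -> Nx3 array of sensor-frame XYZ points."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    x = ranges_m * np.cos(el) * np.cos(az)
    y = ranges_m * np.cos(el) * np.sin(az)
    z = ranges_m * np.sin(el)
    return np.column_stack((x, y, z))

def georeference(points_xyz, rotation_3x3, translation_xyz):
    """Apply the platform pose (e.g., from a UAV's GNSS/INS) to move
    sensor-frame points into the map frame."""
    return points_xyz @ rotation_3x3.T + translation_xyz

# Two example returns, then a placeholder identity pose at a made-up location.
points = returns_to_points(np.array([10.0, 12.5]),
                           np.array([0.0, 1.2]),
                           np.array([-2.0, -2.0]))
map_points = georeference(points, np.eye(3), np.array([500.0, 300.0, 120.0]))
```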

[Image: Vista 1]

How does the deep learning and sensor fusion come together?

Sensor fusion is about bringing information from multiple sensors together. It can involve sensors of the same kind, such as cameras, or multiple sensor types operating in different parts of the spectrum, such as LiDAR, radar and camera. In a centralized sensor fusion architecture, the raw data from all sensors is fused together, and developers use different methods to exploit the combined data for perception, localization and other tasks. Among these approaches, using neural networks to perform deep learning has recently become popular. For certain scenarios and applications, though, deep learning may not be the only approach; classical robotics methods are still employed by the autonomous vehicle and robotics communities.
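
As a minimal sketch of centralized fusion, the snippet below projects LiDAR points into a camera image so that each camera detection can be tagged with a measured depth. The intrinsics and extrinsics are made-up placeholders, not values from any real vehicle.

```python
import numpy as np

K = np.array([[1000.0,    0.0, 640.0],   # camera intrinsics (placeholder)
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                            # LiDAR-to-camera rotation (placeholder)
t = np.array([0.0, -0.2, 0.1])           # LiDAR-to-camera translation, meters

def project_lidar_to_image(points_xyz):
    """Return (u, v) pixel coordinates and depth for points in front of the camera."""
    cam = points_xyz @ R.T + t           # transform into the camera frame
    cam = cam[cam[:, 2] > 0.1]           # keep points in front of the lens
    uvw = cam @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]        # perspective divide
    return uv, cam[:, 2]

uv, depth = project_lidar_to_image(np.array([[2.0, 0.0, 20.0],
                                             [-1.0, 0.5, 35.0]]))
# A fusion stack would now match uv against camera bounding boxes,
# giving each detected object a LiDAR-measured distance.
```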

What will the customer experience?

The customer experience for automated vehicles varies depending on the level of autonomy. Having a high-performance LiDAR is crucial to enable Level 3 and above. A typical Level 3 feature being planned by many OEMs is called highway pilot, meaning the vehicle can drive itself from toll to toll without human intervention; for other, more complex scenarios, humans are still required to take over control. At Level 4, the vehicles will not have a steering wheel. Some examples can be found in the new vehicles from Waymo, GM and Cruise. In the future, riders will use an Uber-like app to hail these vehicles, which will come pick them up without a driver.

Since safety is a primary concern with autonomous driving, what aspects are taken into consideration in the creation of sensors?

As of today, cameras and radars are more mature than LiDAR. For a newer sensor such as LiDAR, there are a few fundamental considerations when designing for autonomous driving. Setting performance and cost aside, reliability and scalability are very important for safety. Reliability comes from the maturity of each component inside the sensor. Are the components mass-produced today? Are they qualified to automotive grade? Does the sensor depend on research-grade components, such as MEMS mirrors, SPADs or OPAs, that pose risks to performance stability under different conditions? How does the system behave under extreme temperature, shock and vibration? At the system level, ASIL-B is required for LiDAR to ensure functional safety. Are failure modes comprehensively analyzed, and how does the sensor report them to the higher-level system? High-volume manufacturability and consistency should be incorporated into the sensor design from day one. Making a sensor prototype function is very different from putting sensors into production vehicles, because the mission of the LiDAR sensor is to save lives.
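
One concrete way to picture the failure-mode reporting mentioned above is a health status the sensor publishes for the higher-level system to act on. The sketch below is purely hypothetical; the fields, thresholds and states are invented for illustration and are not Cepton's diagnostics interface.

```python
from dataclasses import dataclass
from enum import Enum

class HealthState(Enum):
    OK = 0
    DEGRADED = 1   # e.g., partial blockage; upper system may limit speed
    FAILED = 2     # upper system must stop relying on this sensor

@dataclass
class SensorHealth:
    temperature_c: float     # all fields and thresholds are invented
    laser_power_pct: float
    frame_rate_hz: float

    def state(self) -> HealthState:
        if self.frame_rate_hz < 1.0 or self.laser_power_pct < 20.0:
            return HealthState.FAILED
        if self.temperature_c > 85.0 or self.laser_power_pct < 80.0:
            return HealthState.DEGRADED
        return HealthState.OK

print(SensorHealth(90.0, 95.0, 10.0).state())  # HealthState.DEGRADED
```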

How do you think the sensors for self-driving can be improved?

Advanced sensing is very popular and under rapid development. Many companies are working on a variety of sensors – LiDARs, radars, cameras and thermal sensors. Ultimately, the evolution of a sensor depends on component-level innovations. In 10 to 20 years, sensors might look vastly different from the ones we have today. However, OEMs and consumers cannot wait that long. It’s really a matter of finding the right approach, one that leverages existing materials and components into a viable solution. This pushes current sensors toward longer range, higher resolution and lower cost.

Who are your customers? 

Our customer base spans the automotive, industrial and mapping markets. In automotive, our customers are auto OEMs, Tier 1 suppliers and autonomous driving companies that deploy robo-taxis. As of right now, we are not ready to disclose the names of large automakers or Tier 1 companies; May Mobility is our partner in the autonomous driving space. For the industrial market, the customers are automation system integrators, like our partner Clearpath Robotics, which develops autonomous robots for indoor and outdoor research tasks. For mapping, our partner LiDAR USA is a leader in the UAV mapping space. We have at least one customer with a leading position in each market we operate in, which helps us receive feedback and pushes us to build a more generic LiDAR that serves multiple applications.

How does the AI computer comprehend what is around the vehicle? 

The AI computer is the brain, meaning it alone cannot comprehend what is around the vehicle; the brain needs eyes to see what’s happening. This goes back to the sensor fusion question. A vehicle fitted with LiDAR, radar and cameras absorbs multispectral information from the environment and performs perception tasks to make sense of it: objects are identified and classified, movements are tracked and infrastructure is mapped out.
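
A toy version of the movement-tracking step is sketched below: object centroids detected in one fused frame are matched to existing tracks by nearest neighbor. The threshold and data structures are illustrative, not drawn from any production perception stack.

```python
import numpy as np

def update_tracks(tracks, detections, max_jump_m=2.0):
    """tracks/detections: dicts of id -> (x, y); returns updated tracks."""
    updated = {}
    next_id = max(tracks) + 1 if tracks else 0
    unmatched = dict(detections)
    for tid, pos in tracks.items():
        if not unmatched:
            break
        # match this track to its closest detection in the new frame
        did = min(unmatched,
                  key=lambda d: np.hypot(*np.subtract(unmatched[d], pos)))
        if np.hypot(*np.subtract(unmatched[did], pos)) <= max_jump_m:
            updated[tid] = unmatched.pop(did)
    for pos in unmatched.values():   # leftover detections start new tracks
        updated[next_id] = pos
        next_id += 1
    return updated

tracks = update_tracks({}, {0: (10.0, 2.0)})
tracks = update_tracks(tracks, {0: (10.5, 2.1), 1: (30.0, -4.0)})
# Track 0 follows the slowly moving object; a second track is created.
```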

What are the dimensions of the sensor? 

As of now, the LiDAR sensor is quite small. The newest Cepton LiDAR has a cross section the size of a credit card and is a few centimeters deep. In the future, Cepton strives to make the size even smaller so it can be integrated into the headlamps and rear lamps.

Where does the sensor reside in the vehicle?        

Currently, most self-driving vehicles have spinning LiDARs on top of the car, but this is only for research and development purposes; for mass-production vehicles, consumers will not accept such a sensor configuration. Most automakers are working on a seamless integration of LiDAR sensors into the body of the vehicle. There are several possible options: behind the grille, in the fascias, inside the lamps or behind the windshield. The idea is that as sensors become smaller and power consumption drops, they can easily fit into the body of the vehicle.

Are there adjustments a customer might make or is the sensor self contained?

The LiDAR sensor has everything it needs: lasers, detectors, optics and the processing chip. However, in the automotive industry, design and styling are very important, and the form factor of the LiDAR will be heavily influenced by them. Sensor suppliers like Cepton will need to work with OEMs to make performance and design trade-offs that ensure seamless integration without compromising safety standards.
