ADAS: Key Trends on ‘Perception’ – EE Times Asia


Cruise’s latest test vehicle, now rolling off the GM production line, is loaded with sensors, indicated here in red. (Source: Cruise)

Phil Magney, founder and principal at VSI Labs, predicted that “Level 4 [autonomous vehicles] will be rolled out within a highly restricted operational design domain, [and built on a] very comprehensive and thorough safety case.” By “a highly restricted ODD,” Magney said, “I mean specific road, specific lane, specific operation hours, specific weather conditions, specific times of day, specific pick-up and drop-off points, etc.”

Asked if an AI-driven car will ever attain “common sense understanding” — knowing that it is actually driving and understanding its context — Bart Selman, a computer science professor at Cornell University who specializes in AI, said at the conference’s closing panel:


“…at least 10 years away… it could be 20 to 30 years away.”

Adding more smarts at the edge was a trend emerging at the conference. Many vendors are adding intelligence at the sensor node by fusing different sensory data (RGB camera + NIR, RGB + SWIR, RGB + lidar, RGB + radar) right on the edge.

However, opinions among industry players appear split on how to accomplish that. Some promote sensor fusion on the edge, while others, such as Waymo, prefer fusing raw sensory data on a central processing unit.
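Neither camp detailed an implementation at the show. Purely as an illustration of the architectural split, with every name and payload here being hypothetical, a minimal Python sketch of the two data flows might look like this:

```python
# Hypothetical sketch of the two fusion architectures, not any vendor's
# actual design. The trade-off: edge fusion sends only compact object
# lists over the vehicle network, while central fusion forwards raw
# frames so one processor can reason over all sensors jointly.

from dataclasses import dataclass

@dataclass
class RawFrame:
    rgb: bytes    # camera frame (stand-in payload)
    depth: bytes  # lidar/radar/NIR return (stand-in payload)

def fuse(rgb: bytes, depth: bytes) -> list[dict]:
    """Stand-in for any pixel/point-level fusion and detection step."""
    return [{"class": "pedestrian", "xyz": (12.0, 1.5, 0.0)}]  # dummy result

# Edge fusion: each sensor node fuses locally; only detections travel.
def edge_node(frame: RawFrame) -> list[dict]:
    return fuse(frame.rgb, frame.depth)

# Central fusion: nodes forward raw frames; a central unit fuses them all.
def central_unit(frames: list[RawFrame]) -> list[dict]:
    detections: list[dict] = []
    for f in frames:
        detections.extend(fuse(f.rgb, f.depth))
    return detections
```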

The need to see in the dark, whether inside or outside a vehicle, points to the use of IR. While On Semiconductor’s RGB-IR image sensor uses NIR (near-infrared) technology, Trieye, which also came to the show, went a step further by showing off a SWIR (short-wave infrared) camera.

Trieye claims it has found a way to design SWIR sensors using CMOS process technology. “That’s the breakthrough we made. Just like semiconductors, we are using CMOS for the high-volume manufacturing of SWIR cameras from Day 1,” said Avi Bakal, CEO and co-founder of Trieye. Compared to InGaAs sensors, which cost more than $8,000, Bakal said, a Trieye camera will be offered “at tens of dollars.”

David Tokic, vice president of marketing and strategic partnerships at Algolux, told EE Times that automotive engineers working on ADAS and AVs are concerned about two things: 1) robust perception under all conditions, and 2) accurate and scalable vision models.

Typical camera systems deployed in ADAS or AVs today are all different, with large variability: lenses (which determine field of view), sensors, and image signal processing all differ. A technology company picks one camera system, collects a large data set, annotates it, and trains on it to build an accurate neural network model tuned to that system.

But what happens when an OEM replaces the camera originally used for training data?  This change could affect perception accuracy because the neural network model — tuned to the original camera — is now coping with a new set of raw data.
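To make the failure mode concrete, here is a toy illustration with made-up numbers, not data from any vendor: the same scene passed through two simplified ISP models lands at noticeably different pixel statistics, which is exactly the shift a network tuned to the first camera must cope with.

```python
# Toy illustration of camera-swap distribution shift. The ISP model and
# all parameter values are assumptions chosen only for demonstration.

import numpy as np

rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, size=(64, 64))   # idealized scene radiance

def isp(image, gamma, gain):
    """Toy image signal processor: apply gain, clip, then a gamma curve."""
    return np.clip(image * gain, 0.0, 1.0) ** gamma

cam_a = isp(scene, gamma=1 / 2.2, gain=1.0)    # camera the model was trained on
cam_b = isp(scene, gamma=1 / 1.8, gain=1.4)    # replacement camera

# The gap in basic statistics is a crude proxy for the distribution shift.
print(f"mean(A)={cam_a.mean():.3f}  mean(B)={cam_b.mean():.3f}")
```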

Does this require an OEM to collect and train on a data set all over again?

Asked about the swap-ability of image sensors, VSI Labs’ Magney said, “I don’t think this is an option unless the specs are the same.” He noted, “For example at VSI, we trained our own Neural Network for the FLIR thermal camera and those training images were collected with the same specs as the camera that we deployed on. We did swap sensors later, but the specs were the same.”

Algolux, however, claims a new methodology that can translate such previously created data sets “in a matter of days.” Tokic said the company’s Atlas Camera Optimization Suite achieves this by knowing “priors” — the characteristics of a camera and a sensor — and applying them to detection layers. “Our mission here is to democratize the choice of cameras” for OEMs, said Tokic.
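Algolux has not published how Atlas works internally, so the following is only a generic sketch of the “apply camera priors” idea under assumed response-curve priors: if both cameras’ characteristics are known, existing training images can be re-rendered to approximate the new camera’s output instead of being re-collected from scratch.

```python
# Generic data-set translation sketch, not Algolux's actual method.
# Assumes each camera is summarized by known "priors" (here just gain
# and gamma); real camera models are far richer than this.

import numpy as np

def isp(image, gamma, gain):
    """Toy camera response: gain, clip, gamma (an assumption)."""
    return np.clip(image * gain, 0.0, 1.0) ** gamma

def translate(img_a, prior_a, prior_b):
    """Re-render a camera-A training image under camera B's priors."""
    linear = img_a ** (1.0 / prior_a["gamma"]) / prior_a["gain"]  # undo A
    return isp(linear, **prior_b)                                 # apply B

prior_a = {"gamma": 1 / 2.2, "gain": 1.0}   # original training camera
prior_b = {"gamma": 1 / 1.8, "gain": 1.4}   # replacement camera

rng = np.random.default_rng(1)
img_a = isp(rng.uniform(size=(64, 64)), **prior_a)
img_b = translate(img_a, prior_a, prior_b)  # reuse labels, skip recollection
```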

Targeting the newly emerging AI accelerator market, Ceva, for one, unveiled its new AI core and Invite API at the AutoSens conference.

On Semiconductor’s RGB + IR camera announcement at AutoSens, on the other hand, revealed that the On Semi/Eyeris team has picked Ambarella’s SoC as its AI processor for in-vehicle monitoring tasks.

Acknowledging that Ambarella is not generally known as an AI accelerator outfit (it is, rather, a traditional video compression and computer vision chip company), Modar Alaoui, CEO of Eyeris, said, “We couldn’t find any AI chips that can support 10 neural networks, consume less than 5 watts and capture 30 frames per second video by using up to six cameras — all looking inside a vehicle” to run Eyeris’ AI in-vehicle monitoring algorithms. But Ambarella’s CV2AQ SoC fit the bill, he said, beating all the other much-hyped accelerators.

Seeing Machines noted that it has also developed its own hardware, the Fovio driver monitoring chip. Asked if the chip can also serve future in-vehicle monitoring systems, Emmerich explained that the chip’s IP will be applied to a configurable hardware platform.

Redundancy
Combining additional sensors of different modalities and installing them in a vehicle is necessary not only for improving perception but also for adding much-needed redundancy for safety.

Outsight, a startup co-founded by Cedric Hutchings, the former Withings CEO, was at AutoSens pitching a new, highly integrated box consisting of several sensors. He explained that Outsight’s sensor fusion box was designed to “offer perception with comprehension, and localization with understanding of the full environment — including snow, ice and oil on the road.” He added, “We can even classify materials on the road by using active hyperspectral sensing.”

EE Times’ subsequent discussion with Trieye revealed that Outsight will be using Trieye’s SWIR camera. Outsight is promoting its sensor fusion box, scheduled for sampling in the first quarter of 2020, to tier ones and OEMs as an added standalone system that offers “uncorrelated data” for safety and “true redundancy,” Hutchings explained.

The box uses “no machine learning,” offering instead deterministic results to make it “certifiable.”

AEye also pitched its iDAR, a solid-state MEMS lidar fused with an HD camera, for the ADAS/AV market. By combining the two sensors and embedding AI, the system — operating in real time — can “address certain corner cases,” said Aravind Ratnam, AEye’s vice president of product management.

The iDAR system is designed to combine 2D camera “pixels” (RGB) and 3D lidar’s data “voxels” (XYZ) to provide a new real-time sensor data type that delivers more accurate, longer range and more intelligent information faster to an AV’s path-planning system, the company explained.
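AEye has not published the layout of that data type; as a rough, hypothetical illustration of what pairing a camera pixel with a lidar voxel could look like, consider a record such as this:

```python
# Hypothetical fused record pairing lidar geometry with camera color.
# Field names and the projection step are assumptions for illustration;
# iDAR's actual data type is proprietary.

from dataclasses import dataclass

@dataclass
class FusedPoint:
    x: float  # lidar position (meters)
    y: float
    z: float
    r: int    # color of the camera pixel the lidar point projects onto
    g: int
    b: int
    t: float  # capture timestamp (seconds), useful downstream for velocity

def fuse_point(xyz, rgb, t):
    """Pair one lidar return with its projected camera pixel."""
    return FusedPoint(*xyz, *rgb, t)

point = fuse_point((12.4, -0.8, 0.5), (200, 180, 40), t=0.033)
```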

AEye’s AE110 Product Features Compared with Industry Benchmarks and Capabilities (Source: AEye)

In his presentation, Ratnam said AEye studied a variety of use cases. “We looked at 300 scenarios, picked 56 applicable cases and narrowed them down to 20 scenarios” where the fusion of camera, lidar and AI makes sense.

Ratnam showed a scene in which a small child, out of nowhere, chases a ball into the street – right in front of a vehicle. The camera-lidar fusion at the edge works much faster, reducing the vehicle’s reaction time. He noted, “Our iDAR platform can provide very fast calculated velocity.”
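The article does not say how iDAR computes that velocity; a generic finite-difference estimate over two timestamped detections of the same object, with invented numbers, shows the basic idea:

```python
# Generic velocity-from-two-detections sketch, not AEye's algorithm.

def velocity(p0, p1):
    """Estimate (vx, vy, vz) in m/s from two (x, y, z, t) detections."""
    dt = p1[3] - p0[3]
    return tuple((b - a) / dt for a, b in zip(p0[:3], p1[:3]))

# Hypothetical: child first seen 20 m ahead, 1.5 m to the side; 25 ms
# later the track has moved closer and toward the vehicle's path.
v = velocity((20.0, 1.5, 0.0, 0.000), (19.9, 1.3, 0.0, 0.025))
print(v)  # roughly (-4.0, -8.0, 0.0): fast closure argues for braking early
```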

Asked about the advantage of sensor fusion at the edge, a Waymo engineer at the conference told EE Times that he is uncertain if it would make a substantial difference. He asked, “Is it a difference of microseconds? I am not sure.”

AEye is confident of the added value its iDAR can offer to tier ones. Working closely with key partners Hella and LG, Ratnam stressed, “We have been able to drive down the cost of our iDAR. We are now offering a 3D lidar at ADAS prices.”

AEye expects to finish the automotive-grade combined RGB and lidar system, embedded with AI, in the next three to six months. The price will be “less than $1,000,” said Ratnam.

