The latest autonomous driver-assist systems can now navigate busy city streets, responding to traffic lights, pedestrians, unexpected road closures and erratic driving by other cars. Today's Level 4 prototypes can make U-turns, steer themselves alongside other cars on road surfaces with no lane markings, respond to hard-to-see traffic lights and make decisions when there are no traffic lights at all.
Indeed, there is a tremendous gap between the Level 2 cars of yesteryear — Tesla's Autopilot driver-assist system is technically Level 2 — and the decision-making requirements of Level 4 cars in a dense urban setting, with the complexities of stop-and-go traffic, convoluted and contradictory traffic signs, and bicyclists and pedestrians sharing the road space. The computing power alone required to process it all and make decisions on the fly isn't something many industry observers predicted would be operational in 2018, even in prototype form.
To see just how much progress autonomous driving systems have made, have a look at Torc Robotics' Asimov Level 4 self-driving system in the video at the top. Torc's system uses LIDAR, cameras and radar, and is among the systems in development today that come closest to earning the once-shunned "self-driving" label.
Still, there are plenty of traffic situations we can think of that would require action by the driver that some autonomous systems would have difficulty addressing, at least for now.
Is the best software in prototype stages today advanced enough to pull over when an emergency vehicle approaches from behind or from the opposite direction? Developers are working on systems that can detect approaching emergency vehicles by sound. Or how about a police officer stopped in the opposite lane, facing the opposite direction, holding his left arm out the car window to signal traffic in your lane to stop? Or a school bus that pulls over in the opposite lane, or a couple of lanes to your right in your direction of travel? And what about the not-uncommon situation where the car ahead of yours stops at a red light and backs up just a little, and you notice that the driver never shifts back into a forward gear in anticipation of the green? You would probably tap the horn a couple of times in the hope that the driver gets the point before mashing the gas pedal in reverse as soon as the light turns green.
And then there’s the situation where another car flashes its high beams, which can mean a lot of things depending on the context: a signal to proceed, a signal to move over, a warning about a road hazard or a warning about a police officer up ahead measuring speed.
You probably have to deal with a couple of these nuanced situations at least once a week, not counting maneuvering in parking lots and queuing up at gas stations, and there are still plenty of situations in which LIDAR, radar and cameras may not be enough to detect signals or unusual hazards.
The latest autonomous driving systems are on their way to solving these complex but everyday traffic situations that might not stump a human but can be difficult for machines to comprehend and negotiate quickly and safely. In the video at the top, which Torc posted ahead of the Consumer Electronics Show — during which it is demonstrating its systems in real-world Las Vegas traffic — you probably noticed that the Asimov software can already respond to hesitant pedestrians who look as if they want to cross the street but then retreat. These are the types of situations the latest crop of autonomous vehicles is learning to handle, and predicting the actions of unpredictable pedestrians is one of the most important tests that truly self-driving cars will need to pass daily.