Self-driving car technology is advancing rapidly, but critics frequently point out that some hard problems remain. John Leonard, who headed MIT's self-driving car team at the 2007 DARPA Urban Challenge, eloquently describes several challenging situations, including hand-waving police officers and left turns in heavy traffic.
The hand-waving police officer problem can be handled with a simple workaround: the car detects the hand-waving situation, dispatches a camera feed to a remote control center, and asks a remote human operator for guidance (similar to Google's patent 8996224).
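A minimal sketch of such a fallback is shown below. All names (detect routines, RemoteOperatorClient, the advice structure) are hypothetical illustrations and are not taken from the patent or from any real system:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    PROCEED = auto()
    STOP = auto()
    FOLLOW_OPERATOR_PATH = auto()


@dataclass
class RemoteAdvice:
    action: Action
    note: str = ""


class RemoteOperatorClient:
    """Stand-in for the link to a human operator in a remote control center."""

    def request_guidance(self, camera_feed) -> RemoteAdvice:
        # A real system would stream the video and block (with a timeout)
        # until the operator responds; here we simply stop and wait.
        return RemoteAdvice(Action.STOP, "awaiting operator")


def handle_traffic_officer(perception, operator: RemoteOperatorClient) -> RemoteAdvice:
    """If hand signals from a traffic officer are detected, defer to a human."""
    if perception.hand_signals_detected():
        return operator.request_guidance(perception.camera_feed())
    return RemoteAdvice(Action.PROCEED, "no officer detected")
```

The point is not the code itself but the architecture: the hard perception problem ("what does this gesture mean?") is pushed out to a human instead of being solved in software.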
The left turn problem is more interesting. Such situations occur more frequently, and they do present significant challenges. Self-driving car prototypes have been known to wait for long intervals at intersections before finally making the left turn, sorely testing the patience of human drivers stuck behind them. John Leonard's video clearly shows how hard it can be to make a left turn when traffic is heavy in all directions and gaps between cars coming from the left and the right are small and rare.
How do human drivers handle such situations? First they wait and observe the traffic patterns. If opportunities for left turns are rare, they adjust their driving strategy: they may accelerate harder and try to inch into a smaller gap than usual. Sometimes they will edge slightly into the lane of cars coming from the left to signal that they intend to turn and expect other cars to make room. Or they will look for an intermediate spot between the lanes and break the left turn into two moves: first toward this spot, where they come to a stop, and then from the intermediate position into the target lane. Leonard is right that programming such maneuvers into self-driving cars presents a major challenge.
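Even the skeleton of such a two-stage maneuver is already a small state machine. The following sketch is purely illustrative; the phases and the gap checks are assumptions, not a description of how any prototype actually works:

```python
from enum import Enum, auto


class TurnPhase(Enum):
    WAIT_AT_STOP_LINE = auto()
    CROSS_TO_MEDIAN = auto()     # first move: clear the traffic coming from the left
    WAIT_AT_MEDIAN = auto()      # pause in the intermediate spot
    MERGE_INTO_TARGET = auto()   # second move: merge with traffic coming from the right
    DONE = auto()


def next_phase(phase: TurnPhase, gap_left_ok: bool, gap_right_ok: bool) -> TurnPhase:
    """Advance the two-stage left turn whenever the relevant gap is large enough."""
    if phase is TurnPhase.WAIT_AT_STOP_LINE and gap_left_ok:
        return TurnPhase.CROSS_TO_MEDIAN
    if phase is TurnPhase.CROSS_TO_MEDIAN:
        return TurnPhase.WAIT_AT_MEDIAN
    if phase is TurnPhase.WAIT_AT_MEDIAN and gap_right_ok:
        return TurnPhase.MERGE_INTO_TARGET
    if phase is TurnPhase.MERGE_INTO_TARGET:
        return TurnPhase.DONE
    return phase
```

The sequencing is the easy part; the hard questions are hidden in the two booleans, i.e. in deciding when a gap is "good enough".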
But the problem is more fundamental. When we develop self-driving cars, we gain insights about the domain of driving and extend our knowledge not only about algorithms but also about human driving. To make a left turn, self-driving cars have to analyze the traffic situation at the intersection. They are much better than humans at simultaneously identifying the traffic participants in all directions and detecting their speeds, and they are quite good at anticipating their trajectories. Current driverless car prototypes also have no problem deciding on an appropriate path for the left turn. When a self-driving car hesitates at an intersection, the reason is not a problem with the algorithm but rather that the car finds the safety margins for executing the turn too small in the current situation: the risk is too high. Unfortunately, this problem cannot be solved through better algorithms but only by increasing the level of acceptable risk! The risk of a left turn at an intersection is determined by the layout of the intersection, the physics, and the range of potential behavior of the other traffic participants, none of which can be changed by the self-driving car.
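The heart of that decision can be stated very compactly: accept a gap only if the next oncoming vehicle arrives later than the time the car needs to clear its lane, plus a safety margin. A hedged sketch follows; the function name, the parameters, and all numbers are illustrative assumptions, not values from any production system:

```python
def gap_is_acceptable(distance_to_oncoming_m: float,
                      oncoming_speed_mps: float,
                      crossing_time_s: float,
                      safety_margin_s: float = 2.0) -> bool:
    """Accept the gap only if the oncoming car arrives later than the time we
    need to clear its lane plus a margin that encodes the acceptable risk."""
    if oncoming_speed_mps <= 0.0:
        return True  # oncoming car is stopped or moving away
    time_to_arrival_s = distance_to_oncoming_m / oncoming_speed_mps
    return time_to_arrival_s > crossing_time_s + safety_margin_s


# With a 4 s crossing time and a 2 s margin, a car 80 m away at 15 m/s
# (about 54 km/h) arrives in ~5.3 s, which is too close, so the turn waits.
print(gap_is_acceptable(80.0, 15.0, 4.0))    # False
print(gap_is_acceptable(120.0, 15.0, 4.0))   # True (arrival in 8 s)
```

Note where the difficulty lies: not in the arithmetic, which is trivial, but in choosing the safety margin, which is a decision about acceptable risk rather than about algorithms.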
Left turns are indeed known to be risky. We may not think about it when we make a left turn, but accident statistics paint a very clear picture. An NHTSA study that analyzed crossing-path crashes found that police-reported crashes involving left turns (913,000) are almost 10 times as frequent as police-reported crashes involving right turns (99,000). If we consider that right and left turns are not equally distributed in normal driving (right turns occur more frequently, but exact data are not available), then the risk of a left turn may be between 10 and 20 times larger than the risk of a right turn. In 2013, crashes between a motorcycle and another vehicle making a left turn claimed 922 lives; this amounted to nearly half (42%) of all fatalities in crashes involving a motorcycle and another vehicle. Arbella Insurance reported that in 2013, 31% of its severe accident claims involved left turns. Thus human drivers have little reason to be proud of their left-turn capabilities.
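The adjustment for turn frequency can be made explicit with a back-of-the-envelope calculation. The exposure ratios below are assumptions chosen purely for illustration, since exact data are not available:

```python
left_turn_crashes = 913_000
right_turn_crashes = 99_000

raw_ratio = left_turn_crashes / right_turn_crashes  # roughly 9.2

# Hypothetical exposure assumptions: drivers make 1x to 2x as many
# right turns as left turns.
for right_to_left_frequency in (1.0, 1.5, 2.0):
    per_turn_risk_ratio = raw_ratio * right_to_left_frequency
    print(f"right:left turn frequency {right_to_left_frequency:.1f} -> "
          f"a left turn is ~{per_turn_risk_ratio:.0f}x riskier per turn")
```

Under these assumptions the per-turn risk ratio lands roughly in the 10x to 20x range cited above.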
As a consequence, UPS largely eliminated left turns from its routes many years ago. More recently, the route-planning app Waze rolled out a feature that allows users to plan routes without left turns. These two examples show that self-driving cars do not even need the capability of making left turns in heavy traffic; it is possible to get along without such turns.
Thus the left turn problem for self-driving cars leads to the following three insights:
1) The left turn problem is not so much a problem of self-driving cars as a problem of human drivers, who take too many risks at left turns, as we can see from the large number of left-turn accidents and from the careful risk analysis that self-driving cars perform when making a left turn. Autonomous cars should never make left turns as readily and rapidly as human drivers do. As human drivers, we need to be more self-critical about our own capabilities and more ready to question our assumptions about driving, instead of using our own driving behavior as the implicit standard for self-driving cars.
2) We need to carefully consider the acceptable risk profiles for self-driving vehicles. Risk profiles are not black and white: there are more alternatives than the high levels of risk that we take every day as human drivers without much thought and the minimal-risk strategies adopted by all current self-driving car prototypes. It would be unacceptable to let self-driving cars barge into dense traffic in the way we sometimes consider viable and mostly get away with. But it would be possible to reduce the time a driverless car has to wait when turning or merging by allowing it to increase the acceptable risk by a small amount if clearly defined conditions are met (see the sketch after this list). In this area, much work and thinking is required. Expecting a self-driving car to minimize all conceivable risks and at the same time operate as quickly as human drivers is a contradiction in itself. Instead of minimizing all risks, we need to seriously discuss what kinds of small risks should be allowed in well-defined situations.
3) We should not underestimate the creativity of the market in dealing with the remaining problems of self-driving cars. Many of the frequently cited problems of self-driving cars have practical workarounds that don't require much intelligence (three right turns instead of a left turn, remote assistance for the hand-gesture problem, limiting the first self-driving taxis to snow-free regions for the snow problem, etc.).
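To make point 2 a little more concrete, here is a hedged sketch of what "clearly defined conditions" for a slightly smaller safety margin could look like. The conditions, thresholds, and field names are invented for illustration; in practice they would come from regulation and validation, not from a blog post:

```python
from dataclasses import dataclass


@dataclass
class IntersectionContext:
    wait_time_s: float        # how long the car has already been waiting
    visibility_m: float       # sight distance along the cross street
    speed_limit_mps: float    # speed limit on the cross street
    pedestrians_present: bool


BASE_MARGIN_S = 2.0     # conservative default safety margin
REDUCED_MARGIN_S = 1.5  # slightly higher accepted risk


def safety_margin(ctx: IntersectionContext) -> float:
    """Allow a slightly smaller margin only under explicit, checkable conditions."""
    conditions_met = (
        ctx.wait_time_s > 60.0                            # the car has been stuck a while
        and ctx.visibility_m > 8.0 * ctx.speed_limit_mps  # ample sight distance
        and not ctx.pedestrians_present
    )
    return REDUCED_MARGIN_S if conditions_met else BASE_MARGIN_S
```

The value of such a rule is not the particular numbers but the fact that the extra risk is explicit, bounded, and auditable, unlike the risk a human driver takes when forcing a left turn.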