
Tesla upgrades Autopilot, with radar as primary control sensor

Xinhua, September 13, 2016

Tesla Motors, Inc. on Monday announced a software upgrade for the Autopilot mode of its electric vehicles, turning an onboard radar into a primary control sensor.

The new software includes dozens of small refinements plus what the automaker called a "significant upgrade" in the wake of the May 7 fatal crash of a Tesla Model S, a luxury sedan, on a road in Williston, Florida, in the southeastern United States.

In the first fatality on record involving a Tesla electric vehicle operating in Autopilot mode, neither Autopilot nor the driver noticed a tractor trailer turning in front of the vehicle, so the brake was not applied and the sedan passed under the trailer.

Tesla, based in northern California, did not mention the crash in a blog post on its website, but said the upgrade would bring more advanced signal processing to create "a picture of the world" from an onboard radar, which was added to Tesla vehicles in October 2014 as part of the Autopilot hardware suite.

Initially meant to be a supplementary sensor to the primary camera and image processing system, the radar will now be used as a primary control sensor without requiring the camera to confirm visual image recognition.

However, the automaker said, one shortcoming of radar imagery is that any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size.

A discarded soda can on the road, with its concave bottom facing toward the car, can appear to be a large and dangerous obstacle, but drivers would certainly not want to slam on the brakes to avoid it.

To solve the problem, the software upgrade will assemble radar snapshots, taken every tenth of a second, into a 3D "picture" and compare several contiguous frames against the vehicle's velocity and expected path to determine whether something is real and to assess the probability of collision.
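In rough terms, the frame-by-frame check the company describes could be sketched as follows. This is only an illustrative outline; the function names, frame count and tolerances are assumptions, not details Tesla has published.

```python
import numpy as np

FRAME_DT = 0.1  # radar snapshots arrive every tenth of a second

def is_real_object(track_positions, ego_velocity, frames=6, tol=1.0):
    """track_positions: list of (x, y) detections of one radar return, one per
    frame, in the car's coordinate frame (metres). ego_velocity: speed along
    the expected path (m/s). Returns True if the detection behaves like a real
    stationary object being approached at ego_velocity."""
    if len(track_positions) < frames:
        return False
    recent = np.array(track_positions[-frames:])
    # The range to a real object should shrink by about ego_velocity * dt
    # per frame; spurious returns do not track the car's motion this way.
    ranges = np.linalg.norm(recent, axis=1)
    steps = -np.diff(ranges)
    expected_step = ego_velocity * FRAME_DT
    return bool(np.all(np.abs(steps - expected_step) < tol))

def collision_probability(track_positions, ego_velocity, lane_half_width=1.5):
    """Crude stand-in for a collision estimate: meaningful probability only if
    the object is judged real and lies within the car's expected path."""
    if not is_real_object(track_positions, ego_velocity):
        return 0.0
    x, y = track_positions[-1]            # x: distance ahead, y: lateral offset
    in_path = abs(y) < lane_half_width
    closing_time = x / max(ego_velocity, 0.1)
    return 0.9 if in_path and closing_time < 3.0 else 0.1
```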

In addition, an overhead highway road sign positioned on a rise in the road, or a bridge where the road dips underneath, often looks like a collision course to the radar. When approaching such objects, Tesla's vehicle fleet will initially take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car's computer will then compare when it would have braked to the driver's actual action and upload that to the Tesla database.

In what is known as "fleet learning," Tesla noted, "if several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist."
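A minimal sketch of such a geocoded whitelist might look like the following, assuming each car reports radar objects it drove past without incident. The class name, coordinate rounding and pass threshold are illustrative assumptions.

```python
from collections import defaultdict

SAFE_PASS_THRESHOLD = 5  # assumed number of safe passes before whitelisting

class GeocodedWhitelist:
    def __init__(self):
        self.safe_passes = defaultdict(int)   # rounded (lat, lon) -> count
        self.whitelist = set()

    @staticmethod
    def _key(lat, lon):
        # Round coordinates so nearby reports of the same sign or bridge
        # fall into one bucket (about 10 m at four decimal places).
        return (round(lat, 4), round(lon, 4))

    def report_safe_pass(self, lat, lon):
        """Record that a car passed a radar object without braking or impact,
        whether Autopilot was turned on or off."""
        key = self._key(lat, lon)
        self.safe_passes[key] += 1
        if self.safe_passes[key] >= SAFE_PASS_THRESHOLD:
            self.whitelist.add(key)

    def is_whitelisted(self, lat, lon):
        """Objects on the whitelist are no longer treated as braking candidates."""
        return self._key(lat, lon) in self.whitelist
```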

When the data show that false braking events would be rare, the car will begin mild braking using radar, even if the camera does not see the object ahead.

As the system's confidence level rises, the braking force will gradually increase to full strength when it is about 99.99 percent certain of a collision. "This may not always prevent a collision entirely, but the impact speed will be dramatically reduced," the company said.
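The graded response described above can be pictured as a simple mapping from collision confidence to braking force. The thresholds below, other than the 99.99 percent figure quoted in the article, and the linear ramp are assumptions for illustration only.

```python
MIN_CONFIDENCE = 0.90        # assumed point where mild braking begins
FULL_CONFIDENCE = 0.9999     # "about 99.99 percent certain of a collision"
MILD_BRAKE = 0.2             # assumed fraction of maximum braking force
FULL_BRAKE = 1.0

def braking_force(collision_confidence: float) -> float:
    """Map the system's collision confidence to a braking command in [0, 1]."""
    if collision_confidence < MIN_CONFIDENCE:
        return 0.0               # below threshold: no radar-only braking
    if collision_confidence >= FULL_CONFIDENCE:
        return FULL_BRAKE        # near certainty: full braking force
    # Linear ramp from mild to full braking as confidence rises.
    span = (collision_confidence - MIN_CONFIDENCE) / (FULL_CONFIDENCE - MIN_CONFIDENCE)
    return MILD_BRAKE + span * (FULL_BRAKE - MILD_BRAKE)
```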

Confident about the upgrade, the company claimed that "the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions."