Tesla has announced that it is removing ultrasonic sensors from its vehicles and transitioning Autopilot and Full Self-Driving to a camera-only system. The move extends a shift that began last year, when Tesla started removing radar from its lineup. The company plans to start with the Model 3 and Model Y, followed by the Model S and Model X next year.
The new system will use a "vision-based occupancy network" that draws on the eight cameras installed on each vehicle. Tesla claims this approach will give Autopilot high-definition spatial positioning, longer-range visibility, and the ability to identify and differentiate between objects. The company also warns owners of existing cars that they will temporarily lose some semi-autonomous features with the associated software update, but assures them those features will be restored once the new software has been fully validated for safety.
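Tesla has not published the network's architecture, but the general idea of an occupancy network is to fuse features from every camera into one 3D grid of "occupied or free" cells around the car. The following is a minimal, illustrative PyTorch sketch of that idea only; the layer sizes, grid resolution, and the simplistic feature-lifting step are all invented here and bear no relation to Tesla's actual model:

```python
import torch
import torch.nn as nn

class ToyOccupancyNetwork(nn.Module):
    """Toy multi-camera occupancy model (illustrative sketch only)."""

    def __init__(self, num_cameras=8, grid=(16, 16, 4), feat_dim=64):
        super().__init__()
        self.grid = grid
        cells = grid[0] * grid[1] * grid[2]
        # Shared per-camera image encoder: (3, H, W) -> feat_dim vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Crude stand-in for "lifting" fused camera features into
        # per-voxel occupancy logits; real systems use geometry-aware
        # projection, not a single linear layer.
        self.head = nn.Linear(feat_dim * num_cameras, cells)

    def forward(self, images):
        # images: (batch, num_cameras, 3, H, W)
        b, n, c, h, w = images.shape
        feats = self.encoder(images.view(b * n, c, h, w)).view(b, -1)
        logits = self.head(feats)
        # Occupancy probability for each voxel in the grid around the car.
        return torch.sigmoid(logits).view(b, *self.grid)

model = ToyOccupancyNetwork()
frames = torch.randn(1, 8, 3, 64, 96)  # one frame from each of 8 cameras
occupancy = model(frames)
print(occupancy.shape)  # torch.Size([1, 16, 16, 4])
```

The output is a per-cell probability volume rather than a list of labeled objects, which is what lets such a system reason about arbitrary obstacles it was never trained to classify.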
The features that will be temporarily unavailable include Park Assist, Autopark, Summon, and Smart Summon. Autosteer, blind-spot warning, and automatic emergency braking will continue to function as they do today. Tesla believes that feeding a system multiple sensor types can overwhelm it with conflicting data, which is why the company has championed camera-only "Tesla Vision."
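Tesla has not detailed its fusion logic, but the conflict it cites is easy to illustrate with a toy example. The thresholds and the brake_decision function below are hypothetical, invented purely to show how two disagreeing distance estimates force a policy choice, each option with its own failure mode:

```python
def brake_decision(radar_m, camera_m, threshold_m=5.0):
    """Toy fusion rule: decide whether to brake from two distance estimates.

    When sensors disagree, the system must pick a policy: trust one,
    average them, or assume the worst case. Each choice can fail, which
    is the kind of conflict Tesla cites for going camera-only.
    """
    if abs(radar_m - camera_m) > 2.0:          # sensors conflict
        distance = min(radar_m, camera_m)      # conservative: assume closer
    else:
        distance = (radar_m + camera_m) / 2.0  # agreement: average readings
    return distance < threshold_m

# Radar reports a phantom object at 3 m (e.g., a reflection off a manhole
# cover) while the camera sees open road at 40 m; worst-case fusion
# brakes anyway.
print(brake_decision(radar_m=3.0, camera_m=40.0))  # True -> phantom braking
```

The conservative policy sketched here produces exactly the "phantom braking" complaints that dogged Tesla's radar-plus-camera setup, while the opposite policy of trusting the camera would ignore a real obstacle the camera misses.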
Tesla owners have already posted video evidence that the camera-only Tesla Vision system can misjudge distances when stop signs are a nonstandard size, or mistake the Moon for a yellow traffic light. Tesla argues, however, that its occupancy network will continue to improve rapidly over time.
In short, Tesla is betting that cameras alone can replace ultrasonic sensors across Autopilot and Full Self-Driving, trading a temporary loss of features for what it promises will be a more capable vision-only system.