What Elon Musk Said About The 1st Indian-Origin Employee On Tesla's Autopilot Team
Tesla CEO Elon Musk, who has been using social media to recruit people, has disclosed that Indian-origin engineer Ashok Elluswamy was the first employee hired for his electric vehicle company's Autopilot team.
"Ashok was the first person recruited from my tweet saying that Tesla is starting an Autopilot team!" Mr Musk said in a tweet in reply to a video on his interview.
He said that Mr Elluswamy is actually the head of Autopilot engineering, NDTV reports.
"Andrej is director of AI; People often give me too much credit and give Andrej too much credit. The Tesla Autopilot AI team is extremely talented. Some of the smartest people in the world,” he said.
Before joining Tesla, Mr Elluswamy worked at Volkswagen Electronic Research Lab and WABCO Vehicle Control System.
He holds a bachelor's degree in Electronics and Communication Engineering from the College of Engineering Guindy, Chennai, and a master's degree in Robotics System Development from Carnegie Mellon University.
Mr Musk recently tweeted that Tesla is looking for hardcore Artificial Intelligence (AI) engineers who care about solving problems that directly affect people's lives in a major way.
The job application was simple: interested candidates were asked to fill in fields such as name, email and exceptional work done in software, hardware or AI, and to attach their resume in PDF format.
Mr Musk is the wealthiest person in the world, according to Forbes, with a net worth of around $282 billion, most of it in Tesla stock.
US probing Autopilot problems on 765,000 Tesla vehicles
The U.S. government has opened a formal investigation into Tesla’s Autopilot partially automated driving system after a series of collisions with parked emergency vehicles.
The investigation covers 765,000 vehicles, almost everything Tesla has sold in the U.S. since the start of the 2014 model year. In the crashes identified by the National Highway Traffic Safety Administration as part of the probe, 17 people were injured and one was killed.
NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action Monday in a posting on its website.
The probe is another sign that NHTSA under President Joe Biden is taking a tougher stance on automated vehicle safety than under previous administrations. Previously the agency was reluctant to regulate the new technology for fear of hampering adoption of the potentially life-saving systems.
The investigation covers Tesla’s entire current model lineup, the Models Y, X, S and 3 from the 2014 through 2021 model years.
The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.
“Today’s action by NHTSA is a positive step forward for safety,” NTSB Chair Jennifer L. Homendy said in a statement Monday. “As we navigate the emerging world of advanced driving assistance systems, it’s important that NHTSA has insight into what these vehicles can, and cannot, do.”
Last year the NTSB blamed Tesla, drivers and lax regulation by NHTSA for two collisions in which Teslas crashed beneath crossing tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crash for failing to make sure automakers put safeguards in place to limit use of electronic driving systems.
The agency made the determinations after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was driving on Autopilot when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer crossing in its path.
“We are glad to see NHTSA finally acknowledge our long standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries, and deaths,” said Jason Levine, executive director of the nonprofit Center for Auto Safety, an advocacy group. “If anything, this probe needs to go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged.”
Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California highway.
A message was left seeking comment from Tesla, which has disbanded its media relations office. Shares of Tesla Inc., based in Palo Alto, California, fell 4.3% Monday.
NHTSA has sent investigative teams to 31 crashes involving partially automated driver assist systems since June of 2016. Such systems can keep a vehicle centered in its lane and a safe distance from vehicles in front of it. Of those crashes, 25 involved Tesla Autopilot in which 10 deaths were reported, according to data released by the agency.
Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to crossing semis, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.
The probe by NHTSA is long overdue, said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles.
Tesla’s failure to effectively monitor drivers to make sure they’re paying attention should be the top priority in the probe, Rajkumar said. Teslas detect pressure on the steering wheel to make sure drivers are engaged, but drivers often fool the system.
“It’s very easy to bypass the steering pressure thing,” Rajkumar said. “It’s been going on since 2014. We have been discussing this for a long time now.”
The crashes into emergency vehicles cited by NHTSA began on Jan. 22, 2018, in Culver City, California, near Los Angeles, when a Tesla using Autopilot struck a parked firetruck that was partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.
Since then, the agency said there were crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.
“The investigation will assess the technologies and methods used to monitor, assist and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” NHTSA said in its investigation documents.
In addition, the probe will cover object and event detection by the system, as well as where it is allowed to operate. NHTSA says it will examine “contributing circumstances” to the crashes, as well as similar crashes.
An investigation could lead to a recall or other enforcement action by NHTSA.
“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said in a statement. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”
The agency said it has “robust enforcement tools” to protect the public and investigate potential safety issues, and it will act when it finds evidence “of noncompliance or an unreasonable risk to safety.”
In June, NHTSA ordered all automakers to report any crashes involving fully autonomous vehicles or partially automated driver assist systems.
Tesla uses a camera-based system, a lot of computing power, and sometimes radar to spot obstacles, determine what they are, and then decide what the vehicle should do. But Carnegie Mellon's Rajkumar said the company's radar was plagued by "false positive" signals and would stop cars after determining overpasses were obstacles.
Now Tesla has eliminated radar in favor of cameras and thousands of images that the computer neural network uses to determine if there are objects in the way. The system, he said, does a very good job on most objects that would be seen in the real world. But it has had trouble with parked emergency vehicles and perpendicular trucks in its path.
“It can only find patterns that it has been quote-unquote trained on,” Rajkumar said. “Clearly the inputs that the neural network was trained on just do not contain enough images. They’re only as good as the inputs and training. Almost by definition, the training will never be good enough.”
Tesla also is allowing selected owners to test what it calls a “full self-driving” system. Rajkumar said that should be investigated as well.
Scrutiny of Tesla crash a sign that regulation may be coming
The fiery crash of a Tesla near Houston with no one behind the wheel is drawing scrutiny from two federal agencies that could bring new regulation of electronic systems that take on some driving tasks.
The National Highway Traffic Safety Administration and the National Transportation Safety Board said Monday they would send teams to investigate the Saturday night crash on a residential road that killed two men in a Tesla Model S.
Local authorities said one man was found in the passenger seat, while another was in the back. They’re issuing search warrants in the probe, which will determine whether the Tesla’s Autopilot partially automated system was in use. Autopilot can keep a car centered in its lane, keep a distance from cars in front of it, and can even change lanes automatically in some circumstances.
On Twitter Monday, Tesla CEO Elon Musk wrote that data logs “recovered so far” show Autopilot wasn’t turned on, and “Full Self-Driving” was not purchased for the vehicle. He didn’t answer reporters’ questions posed on Twitter.
In the past, NHTSA, which has authority to regulate automakers and seek recalls for defective vehicles, has taken a hands-off approach to regulating partial and fully automated systems for fear of hindering development of promising new features.
But since March, the agency has stepped up inquiries into Teslas, dispatching teams to three crashes. It has investigated 28 Tesla crashes in the past few years, but thus far has relied on voluntary safety compliance from auto and tech companies.
“With a new administration in place, we’re reviewing regulations around autonomous vehicles,” the agency said last month.
Agency critics say regulations — especially of Tesla — are long overdue as the automated systems keep creeping toward being fully autonomous. At present, though, there are no specific regulations and no fully self-driving systems available for sale to consumers in the U.S.
At issue is whether Musk has over-sold the capability of his systems by using the name Autopilot or telling customers that “Full Self-Driving” will be available this year.
“Elon’s been totally irresponsible,” said Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University. Musk, he said, has sold the dream that the cars can drive themselves even though in the fine print Tesla says they’re not ready. “It’s not a game. This is serious stuff.”
Tesla, which has disbanded its media relations office, also did not respond to requests for comment Monday. Its stock fell 3.4% in the face of publicity about the crash.
In December, before former President Donald Trump left office, NHTSA sought public comment on regulations. Transportation Secretary Elaine Chao, whose department included NHTSA, said the proposal would address safety “without hampering innovation in development of automated driving systems.”
But her replacement under President Joe Biden, Pete Buttigieg, indicated before Congress that change might be coming.
“I would suggest that the policy framework in the U.S. has not really caught up with the technology platforms,” he said last month. “So we intend to pay a lot of attention for that and do everything we can within our authorities,” he said, adding that the agency may work with Congress on the issue.
Tesla has had serious problems with Autopilot, which has been involved in several fatal crashes where it failed to stop for tractor-trailers crossing in front of it, stopped emergency vehicles, or a highway barrier. The NTSB, which can only issue recommendations, asked that NHTSA and Tesla limit the system to roads on which the system can safely operate, and that Tesla install a more robust system to monitor drivers to make sure they’re paying attention. Neither Tesla nor the agency took action, drawing criticism and blame for one of the crashes from the NTSB.
Missy Cummings, an electrical and computer engineering professor at Duke University who studies automated vehicles, said the Texas crash is a watershed moment for NHTSA.
She’s not optimistic the agency will do anything substantial, but hopes the crash will bring change. “Tesla has had such a free pass for so long,” she said.
Frank Borris, a former head of NHTSA’s Office of Defects Investigation who now runs a safety consulting business, said the agency is in a tough position because of a slow, outdated regulatory process that can’t keep up with fast-developing technology.
The systems hold great promise to improve safety, Borris said. But the agency is also working with "what is an antiquated regulatory rule promulgating process which can take years."
Investigators in the Houston-area case haven't determined how fast the Tesla was driving at the time of the crash, but Harris County Precinct Four Constable Mark Herman said it was traveling at high speed. He would not say whether there was evidence that anyone tampered with Tesla's system for monitoring the driver, which detects force from hands on the steering wheel. The system issues warnings and eventually shuts the car down if it doesn't detect hands, but critics say it is easy to fool and can take as long as a minute to shut down.
The company has said in the past that drivers using Autopilot and the company’s “Full Self-Driving Capability” system must be ready to intervene at any time, and that neither system can drive the cars itself.
On Sunday, Musk tweeted that the company had released a first-quarter safety report showing that a Tesla with Autopilot engaged has nearly a 10 times lower chance of crashing than the average human-driven vehicle.
But Kelly Funkhouser, head of connected and automated vehicle testing for Consumer Reports, said Tesla’s numbers have been inaccurate in the past and are difficult to verify without underlying data.
“You just have to take their word for it,” Funkhouser said, adding that Tesla doesn’t say how many times the system failed but didn’t crash, or when a driver failed to take over.
Funkhouser said it’s time for the government to step in, set performance standards and draw a line between partially automated systems that require drivers to intervene and systems that can drive themselves.
“There is no metric, there is no yes or no, black or white,” she said. She fears that Tesla is asserting that it’s not a testing autonomous vehicles or putting self-driving cars on the road, while “getting away with using the general population of Tesla owners as guinea pigs to test the system.”