Categories
Mobile Syrup

U.S. Justice Department investigating Tesla self-driving claims

The U.S. Department of Justice has reportedly launched a criminal probe into Tesla’s self-driving vehicle claims.

According to Reuters, three people familiar with the matter told the publication about the probe. The previously undisclosed investigation started last year following over a dozen crashes, some fatal, involving Tesla’s Autopilot automated driving technology. The people told Reuters that Autopilot was activated during the accidents.

Tesla CEO Elon Musk has repeatedly and publicly promised that self-driving cars were coming but hasn’t yet delivered. As noted by The Verge, Musk went from saying Tesla would have 1 million robotaxis on the road by the end of the year to saying it would have 1 million people in the Full Self-Driving (FSD) beta. However, those are very different things.

Tesla vehicles come with a driver-assistance feature called Autopilot, but for an extra $19,500 in Canada, customers can upgrade it to FSD. Despite what Musk has said and the arguably misleading name, FSD still requires driver supervision. Tesla’s website notes as much when you select the FSD add-on, which Reuters says could complicate the Justice Department’s case.

FSD has been a tricky feature for Tesla. Fans love it (so much so that some have sought to put children in harm’s way to prove it works), and critics have repeatedly pointed out safety concerns with Tesla using regular people to beta test FSD. The U.S. National Highway Traffic Safety Administration (NHTSA) is investigating 16 crashes where Tesla vehicles using Autopilot crashed into stationary emergency vehicles, leading to 15 injuries and one fatality.

Moreover, regulators have accused Tesla of false advertising, and customers have sued the company for allegedly misleading them about the capabilities of FSD.

So far, most of this seems to have had little effect on Tesla or Musk, but a Justice Department investigation carries the risk that Tesla or its executives will be charged criminally. Reuters reports that federal prosecutors in Washington and San Francisco are investigating whether Tesla’s claims about Autopilot and FSD misled customers.

Source: Reuters, The Verge

Tesla wants ‘defamatory’ videos of its vehicles striking child mannequins taken down

Tesla wants videos showing its vehicles running over child-sized mannequins taken down, claiming the videos are “defamatory.”

Per The Washington Post and The Verge, Tesla sent a letter to an advocacy group alleging the videos misrepresent the capabilities of its Full Self-Driving (FSD) beta software. This comes after the advocacy group, called the ‘Dawn Project,’ published videos of testing it did with Tesla’s FSD software. The group claimed FSD failed to detect child mannequins and routinely struck them in its tests, as shown in the videos.

However, the videos immediately generated significant controversy, with many leaping to Tesla and CEO Elon Musk’s defence. Some fans went so far as to seek out actual children to stand in front of their cars to prove that FSD could, in fact, detect and stop for children and that the tests didn’t work because testers used mannequins.

Alongside the deranged response from Musk fans, other critical observers noted potential flaws in the testing, such as one video clip that appeared to show that FSD wasn’t activated properly during the test. The Dawn Project released additional footage that showed FSD was activated, but there are still inconsistencies.

Others have raised concerns over Dawn Project founder Dan O’Dowd’s motives. O’Dowd launched a Senate campaign in California that was explicitly focused on Tesla’s FSD, and O’Dowd also runs Green Hill Software, which does business with Tesla competitors like General Motors, BMW, and Ford, according to The Verge. The videos released by the Dawn Project were part of an advertising campaign intended to sway the U.S. Congress to ban Tesla’s FSD.

Interestingly, The Verge noted that Tesla doesn’t appear to have filed objections to fan videos using real children to test FSD, although the letter comes days after YouTube removed several such videos. However, Tesla described the Dawn Project videos as portraying “unsafe and improper use of FSD Beta and active safety features” in the letter.

In response to the letter, O’Dowd called “Master Scammer Musk” a “crybaby.”

Regardless of your thoughts on O’Dowd and the Dawn Project, the concerns raised about FSD are legitimate. Critics have taken issue with Tesla allowing regular people to test beta software on public roads, and there have been calls for regulators to step in. Others have accused Tesla of being misleading with the name and marketing of FSD. Musk has repeatedly promised FSD would become fully autonomous, but as of now, the system still requires the driver to stay engaged and ready to stop or correct the vehicle. Tesla also fired an employee over videos they uploaded showing issues with FSD.

Source: The Verge, Washington Post

Safety tests show Tesla vehicles ‘repeatedly’ fail to detect children

Safety tests show Tesla’s ‘Full Self-Driving’ (FSD) Beta technology fails to recognize children, adding to concerns about its safety as the company makes the software available to more users.

First, a safety test conducted by the ‘Dawn Project’ (via The Guardian) found that a Tesla Model 3 with FSD “repeatedly struck [a] child mannequin in a manner that would be fatal to an actual child.” Dawn Project seeks to improve the safety and reliability of software by stopping the use of commercial-grade software in safety-critical systems.

Further, investor Taylor Ogan shared a short video on Twitter showing a comparison between a Tesla and a vehicle equipped with LiDAR tech from Luminar — in the video, the Tesla hits the child mannequin while the LiDAR-equipped car manages to stop. In follow-up tweets, Ogan criticizes Tesla for not adopting LiDAR technology for its autonomous vehicle software.

LiDAR, for those unfamiliar with the term, stands for ‘light detection and ranging’ (or ‘laser imaging, detection, and ranging’). The tech determines the range to an object by bouncing a laser pulse off it and measuring how long the reflected light takes to return; since light travels at a known speed, that round-trip time translates directly into a distance.
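The time-of-flight principle behind LiDAR comes down to one line of arithmetic: the pulse travels to the target and back, so the one-way distance is half the round trip. A minimal sketch (the function name is illustrative, not from any particular LiDAR library):

```python
# Time-of-flight range estimate: a laser pulse travels to the target
# and back, so the one-way distance is half the round-trip distance.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second, in vacuum


def lidar_range_m(round_trip_time_s: float) -> float:
    """Estimate the distance to a target from a LiDAR pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2


# A pulse that returns after 200 nanoseconds indicates a target ~30 m away.
print(round(lidar_range_m(200e-9), 2))  # → 29.98
```

Real automotive LiDAR units fire millions of such pulses per second across a field of view to build a 3D point cloud, which is why proponents like Ogan argue they catch obstacles that camera-only systems miss.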

The Dawn Project test will form part of an advertising campaign intended to encourage U.S. Congress to ban Tesla’s FSD.

Tesla and CEO Elon Musk have so far disputed concerns over the safety of FSD. At the same time, the U.S. National Highway Traffic Safety Administration (NHTSA) has launched investigations and requested information from the company about FSD. It’s worth noting that Tesla has also recently made FSD available in Canada.

A common line of defence appears to be claiming that FSD still requires driver assistance and is not fully autonomous. And while Tesla does state this on its website, the name — Full Self-Driving — suggests otherwise. Moreover, Tesla made the software available to thousands of Tesla owners to use on public roads, many of whom have misused FSD. Tesla has also delayed or pulled FSD updates over bugs and other issues several times and even fired an employee who shared a video of flaws with the FSD system.

There are clear safety concerns at play here, and critics have highlighted these concerns in an attempt to get governments to regulate the use of autonomous driving systems on public roads until the systems are safer and more reliable. Tesla fans have responded by attacking these critics online, with one Twitter user going so far as to request a child volunteer to run in front of their FSD-equipped Tesla to prove it will stop.

“I promise I won’t run them over,” the person wrote. Yea, sure bud.

Source: Dawn Project, Taylor Ogan (Twitter) Via: The Guardian

Elon Musk tweets Tesla FSD price will increase to $12,000 but only in U.S.

Tesla CEO Elon Musk is back on Twitter again, this time to announce that the company will raise the price of its ‘Full Self-Driving’ (FSD) software to $12,000 on January 17th.

Thankfully, the price hike will not impact Canadians — Musk followed up his original tweet noting the change was “Just in the US.” It’s somewhat surprising, given the similarity in the package’s pricing between the countries — U.S. customers currently pay $10,000 USD for FSD while Canadians pay $10,600 CAD ($10,000 USD is worth about $12,645 CAD).

Regardless, it’s good news for any prospective Tesla customers in Canada (and bad news for any U.S.-based Tesla customers). Still, Musk does have a tendency to change pricing on a whim. In October, Tesla hiked the price of its supposedly more affordable Model 3 by almost $3,400 over two weeks, bringing that car to just $10 shy of the federal EV rebate limit of $55,000 (once you factor out fees for delivery, air conditioning and other items).

Tesla also upped the price of its Model Y in October 2021, and in 2020 it dropped the price of the Model S in Canada after Musk changed the U.S. price to $69,420. (The Model S price has changed since, thanks in part to the release of the ‘Plaid’ version.)

Musk followed up his FSD price increase tweet by noting that the FSD price would continue to rise as the company gets closer to the “production code release.” That likely means Canadians will see the FSD price increase in the future, even if the price isn’t changing at the moment.

It’s worth noting that the FSD software is still in beta despite Musk’s various promises over the years that it would be available by now (The Verge notes that Musk said FSD would exit beta in 2018 and in 2019 said it’d be on “over a million cars” in 2020).

We’re in 2022 now and over the last few months, the FSD beta has drawn increased scrutiny and criticism from regulators and reporters. Concerns stem from the decision to let regular people beta-test the FSD software — reasonable, considering people keep posting videos of themselves misusing it. There are also concerns with how Tesla represents FSD, with some calling the ‘Full Self-Driving’ name misleading.

Source: Elon Musk (Twitter) Via: The Verge

Tesla vehicle in Full Self-Driving mode reportedly crashed in California

A Tesla Model Y reportedly crashed earlier this month in California while in the Full Self-Driving (FSD) beta mode.

As reported by The Verge, the November 3rd crash happened in Brea, a city southeast of Los Angeles. It is likely the first reported incident involving the company’s controversial driver-assist feature. No one was injured in the crash, although the vehicle was allegedly “severely damaged.”

The Verge notes that the crash was reported to the National Highway Traffic Safety Administration (NHTSA), which currently has multiple overlapping investigations into Tesla’s Autopilot system. The incident report appears to have been filed by the owner of the Model Y — you can read a snippet published by The Verge below:

“The Vehicle was in FSD Beta mode and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane. the car gave an alert 1/2 way through the turn so I tried to turn the wheel to avoid it from going into the wrong lane but the car by itself took control and forced itself into the incorrect lane creating an unsafe maneuver putting everyone involved at risk. car is severely damaged on the driver side.”

Tesla has come under increased scrutiny and criticism for testing its FSD beta software with untrained vehicle owners on public roads. There are several issues with the process Tesla has chosen for the FSD beta. Instead of trained drivers, the company gathered data on Tesla drivers, gave them safety scores and then provided FSD access to those who had high scores. Moreover, Tesla has also repeatedly delayed or even reverted software updates due to bugs and other concerns.

However, perhaps one of the most significant issues is the misleading name. Full Self-Driving is not an autonomous driving system, and motorists need to pay attention and keep their hands on the steering wheel to correct the car when it makes mistakes.

It’s also worth keeping in mind that we do not yet know the full details of this particular incident. I’ve seen some people already calling parts of the crash report into question. While it’s possible — likely even — that new details will emerge as more information becomes available, that doesn’t mean concerns about testing FSD on public roads are unreasonable or illegitimate. As FSD becomes available to more people, the likelihood of incidents like this will go up. And while Tesla drivers signed up to test FSD, the other motorists sharing the road with them did not.

Source: The Verge

Musk delays Tesla Full Self-Driving beta release over ‘last minute concerns’

Earlier this week, Tesla CEO Elon Musk said on Twitter that the company’s Full Self-Driving (FSD) beta would release to about 1,000 people. Musk has now backtracked, citing “last minute concerns about this build.”

Instead of releasing on Friday as expected, Musk tweeted Saturday morning that the FSD beta would roll out “likely on Sunday or Monday.”

Tesla’s FSD software was set to roll out to roughly 1,000 people who met the company’s safety requirements. The company determines drivers’ ‘safety score’ via data collected by sensors built into Tesla vehicles. Initially, the FSD beta would roll out to those with a perfect safety score (100 points out of 100 total), followed by a gradual rollout to those with scores of 99/100 and below.
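The staged rollout described above amounts to a simple threshold check against each driver’s safety score, with the threshold lowered over time. A hypothetical sketch of that gating logic (the names and structure here are illustrative; Tesla’s actual selection code is not public):

```python
# Hypothetical score-gated staged rollout (illustrative only;
# Tesla's actual beta-selection logic is not public).
def eligible_for_beta(safety_score: int, rollout_threshold: int) -> bool:
    """A driver gains beta access once the rollout threshold drops to their score."""
    return safety_score >= rollout_threshold


# Example driver scores, as assigned by the automaker's telemetry.
drivers = {"alice": 100, "bob": 99, "carol": 97}

# Wave 1: perfect scores only.
wave1 = [name for name, score in drivers.items() if eligible_for_beta(score, 100)]
# A later wave: threshold lowered to 99, admitting more drivers.
wave2 = [name for name, score in drivers.items() if eligible_for_beta(score, 99)]

print(wave1)  # → ['alice']
print(wave2)  # → ['alice', 'bob']
```

Lowering the threshold wave by wave is what produces the "gradual rollout" Tesla described: each reduction admits the next band of drivers without re-evaluating those already in the beta.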

Tesla launched the safety score system alongside the ability for vehicle owners to request FSD beta access. The idea appears to be that safe drivers — as determined by Tesla — will be able to better handle testing FSD, which requires driver supervision.

It’s worth noting that FSD does not make Tesla vehicles fully autonomous. Musk previously said that the feature-complete version will “likely” be able to drive someone from their home to work without intervention, and will still require supervision.

Musk did not elaborate on what the “concerns” were with the FSD beta build that delayed the rollout. Ultimately, I’m not surprised to see a delay.

Source: Elon Musk (Twitter) Via: The Verge