TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article, here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • @keesrif@lemmy.world
    1 year ago

    On a quick read, I didn’t see the struck motorcycles listed. Last I heard, a few years ago, was that this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe this is mostly true for Harleys.

    The theory I recall was that this rear-light configuration made the Tesla assume it was looking (remember: only cameras, no depth data) at a car that was further down the road, and that accelerating was therefore safe. It miscategorised the motorcycle so badly that it misjudged its position entirely.
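The geometry behind this theory is easy to sketch. A single camera can only infer distance from apparent size, so it has to assume the real-world spacing of the two tail lights it sees. All numbers below are illustrative assumptions, not Tesla's actual pipeline:

```python
# Sketch of the mis-scaling theory (illustrative numbers only): with one
# camera, distance comes from the pinhole relation Z = f * W / w, where W
# is the ASSUMED real-world spacing of the two lights.

FOCAL_PX = 1000              # assumed focal length in pixels
CAR_LIGHT_SPACING_M = 1.5    # typical car tail-light spacing (assumption)
MOTO_LIGHT_SPACING_M = 0.25  # twin low lights on one motorcycle (assumption)

def distance_from_spacing(apparent_px: float, assumed_spacing_m: float) -> float:
    """Monocular estimate: Z = f * W / w (W = assumed real spacing)."""
    return FOCAL_PX * assumed_spacing_m / apparent_px

# Two lights 25 px apart in the image, actually a motorcycle 10 m ahead:
apparent_px = FOCAL_PX * MOTO_LIGHT_SPACING_M / 10.0   # = 25 px
true_distance = distance_from_spacing(apparent_px, MOTO_LIGHT_SPACING_M)
misread_distance = distance_from_spacing(apparent_px, CAR_LIGHT_SPACING_M)

print(true_distance)     # 10.0 m -> should be braking
print(misread_distance)  # 60.0 m -> "safe to accelerate"
```

Under these assumptions, mistaking a motorcycle's lights for a car's inflates the distance estimate sixfold, which matches the "it thought the car was far down the road" description.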

    • @jonne@infosec.pub
      1 year ago

      Whatever it is, it’s unacceptable and they should really ban Tesla’s implementation until they fix some fundamental issues.

    • KayLeadfootOP
      1 year ago

      I also saw that theory! That’s in the first link in the article.

      The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.

      I didn’t include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!

      The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a “standard” bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the different motorcycles people ride on highways; every type is represented (sadly) in the fatalities.

      I think you’re onto something with the failed depth perception. Sensing distance is difficult with optical sensors alone. That’s why Tesla would be alone in the motorcycle fatality bracket, and why it would always be rear-end crashes by the Tesla.

      • @grue@lemmy.world
        1 year ago

        Because I do journalism, and sometimes I even do good journalism!

        In that case, you wouldn’t happen to know whether or not Teslas are unusually dangerous to bicycles too, would you?

        • KayLeadfootOP
          1 year ago

          Surprisingly, there is a data bucket for accidents with bicyclists, but hardly any bicycle crashes are reported.

          That either means that they are not occurring (woohoo!), or that means they are being lumped in as one of the multiple pedestrian buckets (not woohoo!), or they are in the absolutely fucking vast collection of “severity: unknown” accidents where we have no details and Tesla requested redaction to make finding the details very difficult.

    • @treadful@lemmy.zip
      1 year ago

      Still probably a good idea to keep an eye on that Tesla behind you. Or just let them past.

    • @ExcessShiv@lemmy.dbzer0.com
      1 year ago (edited)

      The ridiculous thing is, it has 3 cameras pointing forward, and you only need 2 to get stereoscopic depth perception with cameras… why the fuck are they not using that!?

      Edit: I mean, I know why: the cameras have three different lenses used for different things (normal, wide angle, and telephoto), so they’re not suitable for it. But it still seems stupid not to utilise that concept when you insist on a camera-only solution.
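The stereoscopic depth the commenter refers to is the textbook pinhole relation: two cameras a baseline apart see the same object at slightly different image positions, and that disparity gives depth. The camera parameters below are made-up assumptions, not Tesla's:

```python
# Minimal stereo-depth sketch (textbook pinhole model, assumed parameters):
# depth Z = f * B / d, where B is the baseline between the two lenses and
# d is the disparity (pixel offset) between the two views.

FOCAL_PX = 1000      # focal length in pixels (assumption)
BASELINE_M = 0.15    # horizontal spacing between the two lenses (assumption)

def depth_from_disparity(disparity_px: float) -> float:
    """Z = f * B / d: depth falls as disparity grows."""
    if disparity_px <= 0:
        raise ValueError("zero disparity means the object is at infinity")
    return FOCAL_PX * BASELINE_M / disparity_px

print(depth_from_disparity(15.0))  # nearby object
print(depth_from_disparity(3.0))   # distant object
```

The catch the edit mentions is real: this formula assumes two matched cameras, and matching features across lenses with different focal lengths and fields of view is a much harder correspondence problem.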

      • amorpheus
        1 year ago

        That seems like a spectacular oversight. How is it supposed to replicate human vision without depth perception?

  • @Ledericas@lemm.ee
    1 year ago

    The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.

  • @spacesatan@leminal.space
    1 year ago

    Unless it’s a higher rate than human drivers per mile or hours driven, I do not care. The article doesn’t have those stats, so it’s clickbait as far as I’m concerned.

    • @chetradley@lemm.ee
      1 year ago

      The fact that the other self driving brands logged zero motorcyclist fatalities means the technology exists to prevent more deaths. Tesla has chosen to allow more people to die in order to reduce cost. The families of those five dead motorcyclists certainly care.

    • KayLeadfootOP
      1 year ago

      Thanks, 'Satan.

      Do you know the number of miles driven by Tesla’s self-driving tech? Because I don’t. Tesla won’t say; they’re a remarkably non-transparent company where their tech is concerned. Near as I can tell, nobody knows (other than folks locked up tight with NDAs). If the ratio of accidents-per-mile-driven looked good, you know as a flat fact that Elon would be Tweeting all about it.

      Sorry you didn’t find the death of 5 Americans newsworthy. I’ll try harder for the next one.

    • @AA5B@lemmy.world
      1 year ago

      Same goes for the other vehicles. They didn’t even try to cover miles driven and it’s quite likely Tesla has far more miles of self-driving than anyone else.

      I’d even go so far as to speculate that the zero accidents of other self-driving vehicles could just be zero information, because we don’t have enough data to call it zero.

      • KayLeadfootOP
        1 year ago

        No, the zero accidents for other self-driving vehicles is actually zero :) You may have heard of this little boutique automotive manufacturer, Ford Motor Company. They’re one of the primary competitors, and they are far above the mileage where you would expect a fatal accident if they were as safe as a human.

        Ford has reported self-driving crashes (many of them!). Just no fatal crashes involving motorcycles, because I guess they don’t fucking suck at making self-driving software.

        I linked the data, it’s all public governmental data, and only the Tesla crashes are heavily redacted. You could… IDK… read it, and then share your opinion about it?

        • @AA5B@lemmy.world
          1 year ago

          And how does it compare on self-driving time or miles? Because on the surface, if Tesla is responsible for 5 such accidents and Ford zero, but Tesla has significantly more than five times the self-driving time or miles, then we just don’t have data yet… and I see an announcement that Ford expects full self-driving in 2026, so it can’t have been used much yet.

  • @NotMyOldRedditName@lemmy.world
    1 year ago (edited)

    For what it’s worth, it really isn’t clear whether this is FSD or AP, given the constant mention of “self-driving” even for older collisions when it would definitely have been AP; some are even listed as AP if you click through the links to the crash.

    So these may all be AP, or one or two might be FSD; it’s unclear.

    Every Tesla has AP, so the likelihood of that being the case is higher.

    • @AA5B@lemmy.world
      1 year ago (edited)

      In this case, does it matter? Both are supposed to follow a vehicle at a safe distance

      I’d be more interested in how it changes over time, as new software is pushed. While it’s important to know that it had problems judging distance to a motorcycle, it’s more important to know whether it still does.

      • @NotMyOldRedditName@lemmy.world
        1 year ago (edited)

        In this case, does it matter? Both are supposed to follow a vehicle at a safe distance

        I think it does matter. While both are supposed to follow at safe distances, the FSD stack is doing it in a completely different way. They haven’t really been making any major updates to AP for many years now; all focus has been on FSD. I think the only real changes it’s had for quite a while have been around making sure people are paying attention better.

        AP is looking at the world frame by frame, each individual camera on its own, while FSD takes the input of all cameras, turns it into a 3D vector space, and then drives based off that. Doing that on city streets and highways is only a pretty recent development; updates for doing it this way on highways went out to all cars with FSD in the past few months. For a long time it was on city streets only.

        I’d be more interested in how it changes over time, as new software is pushed.

        I think that’s why it’s important to make a real distinction between AP and FSD today (and specifically which FSD versions)

        They’re wholly different systems, one that gets older every day, and one that keeps getting better every few months. Making an article like this that groups them together over the span of years muddies the water on what / if any progress has been made.

        • KayLeadfootOP
          1 year ago

          Fair enough!

          At least one of the fatalities is Full Self-Driving (it was cited by name in the police reports). The remainder are Autopilot. So, both systems kill motorcyclists. Tesla requests that this data be redacted from their NHTSA reporting, which specifically makes it difficult for consumers to measure which system is safer, or whether incremental safety improvements are actually being made.

          You’re placing a lot of faith in the incremental updates being improvements without equivalent regressions. That data is specifically being concealed from you, and I think you should probably ask why. If there were good news behind those redactions, they wouldn’t be redactions.

          I didn’t publish the software-version data point because I agree with AA5B: it doesn’t matter. I honestly don’t care how it works. I care that it works well enough to safely cohabit the road with my manual-transmission Cro-Magnon self.

          I’m not a “Tesla reporter,” I’m not trying to cover the incremental changes in their software versions. Plenty of Tesla fans doing that already. It only has my attention at all because it’s killing vulnerable road users, and for that analysis we don’t actually need to know which self-driving system version is killing people, just the make of car it is installed on.

          • @NotMyOldRedditName@lemmy.world
            1 year ago (edited)

            I’d say it’s a pretty important distinction to know if one or both systems have a problem and the level of how bad that problem is.

            Also are you referencing the one in Seattle in 2024 for FSD? The CNBC article says FSD, but the driver said AP.

            And especially back then, there’s also an important distinction of how they work.

            FSD on highways wasn’t released until November 2024, and even then not everyone got it right away. So even if FSD was enabled, the crash may have been under AP.

            Edit: Also, if it really was FSD (that 2024 crash would have had to happen on city streets, not a highway), then that’s 1 motorcycle fatality in 3.6 billion miles. The other 4 happened over 10 billion miles. Is that not an improvement? (Edit again: I should say we can’t tell it’s an improvement yet, as we’d have to pass 5 billion, so the jury is still out I guess IF that crash was really on FSD.)

            Edit: I will cede though that as a motorcyclist, you can’t know what the Tesla is using, so you’d have to assume the worst.

              • @NotMyOldRedditName@lemmy.world
                1 year ago (edited)

                The motorcyclist was killed on a freeway merge ramp.

                I’d say that means there’s a very good chance that, while FSD was enabled, the crash happened under the older AP mode of driving, as it wasn’t until November 2024 that highway driving was moved over to the new FSD neural-net driving code. I was wrong here: it actually was FSD then, it just wasn’t end-to-end neural nets then like it is now.

                Also yikes… the report says the AEB kicked in, and the driver overrode it by pressing on the accelerator!
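The "1 in 3.6 billion vs 4 in 10 billion" question raised a few comments up can be framed as a quick Poisson check, using the figures from that comment. This is a back-of-envelope sketch, not a proper safety analysis:

```python
# If the newer stack were no better than the older one, how surprising is
# seeing only 1 fatality in its 3.6 billion miles? (Figures from the thread;
# a Poisson back-of-envelope, not a rigorous comparison.)
import math

ap_rate = 4 / 10.0               # fatalities per billion miles, older stack
fsd_miles = 3.6                  # billions of miles on the newer stack
expected = ap_rate * fsd_miles   # expected fatalities at the old rate

# P(at most 1 fatality) under Poisson(expected), i.e. "nothing improved":
p_le_1 = math.exp(-expected) * (1 + expected)

print(round(expected, 2))  # 1.44 expected under the old rate
print(round(p_le_1, 2))    # 0.58: far too likely by chance to call it progress
```

This lines up with the commenter's intuition that the mileage so far is too small to distinguish the two systems.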

    • @psivchaz@reddthat.com
      1 year ago

      That’s not good though, right? “We have the technology to save lives, it works on all of our cars, and we have the ability to push it to every car in the fleet. But these people haven’t paid extra for it, so…”

      • @NotMyOldRedditName@lemmy.world
        1 year ago (edited)

        Well, only 1 or 2 of those were in a time frame where I’d consider FSD superior to AP; it’s only recently that that’s likely the case.

        But to your point, at some point I expect Tesla to use the FSD software for AP for the exact reasons you mentioned. My guess is they’d just do something like disable making left/right turns, so you wouldn’t be able to use it outside of straight stretches like AP today.

  • @0x0@programming.dev
    1 year ago (edited)

    This is news? Fortnine talked about it two years ago.
    TL;DR: Tesla removed LIDAR to save a buck, and at night the cameras see two red dots that the ’puter thinks are a far-away car when it’s actually a close motorcycle.

      • @AA5B@lemmy.world
        1 year ago

        Why not? It’s got multiple cameras so could judge distances the same way humans do.

        However, there have been both hardware and software updates since most of those, so the critical question is: how much of a problem is it still? The article had no info or speculation on that.

    • @LesserAbe@lemmy.world
      1 year ago

      It’s helpful to remember that not everyone has seen the same stories you have. If we want something to change, like regulators not allowing dangerous products, then raising public awareness is important. Expressing surprise that not everyone knows about something can be counterproductive.

      Going beyond that, wouldn’t the new information here be the statistics?

      • bluGill
        1 year ago

        like regulators not allowing dangerous products,

        I include human drivers in the list of dangerous products I don’t want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don’t want regulators to pick favorites. I want them to find “the truth”.

      • @JordanZ@lemmy.world
        1 year ago

        My state allowed motorcycle filtering in 2019 (not the same as California’s lane splitting). They ran a study and found a ton of motorcyclists were being severely injured or killed while getting rear ended sitting at stop lights. Filtering allows them to move to the front of the traffic light while the light is red and traffic is stationary. Many people are super aggravated about it even though most of the world has been doing it basically forever.

    • @TexasDrunk@lemmy.world
      1 year ago

      I’m on mine far more often than I’m in a car. I think Tesla found out that I point and laugh at any Cybertrucks I see at red lights while I’m out, and is trying to kill me.

  • AnimalsDream
    1 year ago

    I imagine bicyclists must be affected as well if they’re on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.

    Time to go to the Netherlands.

  • Lka1988
    1 year ago

    Good to know, I’ll stay away from those damn things when I ride.

    • dual_sport_dork 🐧🗡️
      1 year ago

      I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.

      Just be sure to carefully watch your six when you’re sitting at a stoplight. I’ve gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I’ll have to scoot out of the way of some imbecile who’s coming in hot. That’s hard to do when your front tire is 24" away from the license plate of the car in front of you.

      • Lka1988
        1 year ago

        For me it depends which bike I’m riding. If it’s my 49cc scooter, I’ll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I’ll just filter to the front (legal in Utah).

        • @Korhaka@sopuli.xyz
          1 year ago

          I filter to the front on my leg powered bike, most traffic light setups here have a region for bikes at the front of the cars.

  • TrackinDaKraken
    1 year ago

    Five years ago, you could not have brought this up without Musk simps defending it.

  • @Critical_Thinker@lemm.ee
    1 year ago

    Let’s get this out of the way: Felon Musk is a nazi asshole.

    Anyway, it should be criminal to do these comparisons without showing human-driver statistics for reference. I’m so sick of articles that leave out hard data. Show me deaths per billion miles driven for Tesla, competitors, and humans.

    Then there’s shit like the Boca Raton crash, where they mention the car going 100 in a 45 and killing a motorcyclist, then go on to say the only way to do that is to physically use the gas pedal, and that doing so disables emergency braking. Is it really a self-driving car at that point, when a user must actively engage to disable portions of the automation? If you take an action to override stopping, it’s not self-driving; stopping is a key function of how self-driving tech self-drives. It’s not like the car swerved to another lane and nailed someone: the driver literally did this.

    Bottom line, I look at the media around self-driving tech as sensationalist. Danger drives clicks. Felon Musk is a nazi asshole, but self-driving tech isn’t made by the guy; it’s made by engineers. I wouldn’t buy a Tesla unless he had no stake in the business, but I do believe people are far more dangerous behind the wheel in basically all typical driving scenarios.

    • KayLeadfootOP
      1 year ago

      In Boca Raton, I’ve seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.

      Insanely, you can slam on the gas in Tesla’s self-driving mode, accelerate to 100MPH in a 45MPH zone, and strike another vehicle, all without the vehicle’s “traffic aware” automation effectively applying a brake.

      That’s not sensationalist. That really is just insanely designed.

      • @Critical_Thinker@lemm.ee
        1 year ago

        FTFA:

        Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.

        That’s how you would strike a motorcyclist at such extreme speed, simply press the accelerator and all other inputs are apparently overridden.

        If the guy smashes the gas, just like in cruise control I would not expect the vehicle to stop itself.

        The guy admitted to being intoxicated and held the gas down… what’s the self-driving contribution to that?

        • KayLeadfootOP
          1 year ago

          I know what’s in the article, boss. I wrote it. No need to tell me FTFA.

          TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car’s sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.

          I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.

          Here’s the manual, if you’re curious. It doesn’t work in bright sunlight, fog, excessively curvy roads (???), situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that “The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control,” so it’s all that shit, and anything else - if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.

          https://www.tesla.com/ownersmanual/2012_2020_models/en_us/GUID-50331432-B914-400D-B93D-556EAD66FD0B.html#:~:text=Traffic-Aware Cruise Control determines,maintains a set driving speed.

          • @Critical_Thinker@lemm.ee
            1 year ago

            So do you expect self driving tech to override human action? or do you expect human action to override self driving tech?

            I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited and experimental implementations that are not road legal nationwide. I kind of expect human input to override the robot given the fear of robots making mistakes despite the humans behind them getting into them drunk and holding down the throttle until they turn motorcyclists into red mist. But that’s my assumption.

            With the Boca Raton one specifically, the guy got in his car inebriated. That was the first mistake that caused the problem, and it should never have happened. If the car were truly self-driving, automated with no user input, this wouldn’t have happened. It wouldn’t have gone nearly 2.5x the speed limit. It would have braked long in advance of hitting someone in the road.

            I have a ninja 650. We all know the danger comes from things we cannot control, such as others. I’d trust an actually automated car over a human driver always, even with limited modern tech. The second the user gets an input though? zero trust.

            • KayLeadfootOP
              1 year ago

              The driver being drunk doesn’t mean the self-driving feature should not detect motorcycles. The human is a fallback to the tech. The tech had to fail for this fatal crash to occur.

              If the system is advertised as overriding the human speed inputs (Traffic-Aware Cruise Control is supposed to brake when it detects traffic, regardless of pedal inputs), then it should function as advertised.

              Incidentally, I agree, I broadly trust automated cars to act more predictably than human drivers. In the case of specifically Teslas and specifically motorcycles, it looks like something is going wrong. That’s what the data says, anyhow. If the government were functioning how it should, the tech would be disabled during the investigation, which is ongoing.

    • @Nastybutler@lemmy.world
      1 year ago

      He may not be an engineer, but he’s the one who made the decision to use strictly cameras rather than lidar, so yes, he’s responsible for these fatalities that other companies don’t have. You may not be a fan of Musk, but it sounds like you’re a fan of Tesla.