TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • Buffalox@lemmy.world · 7 days ago

    Hey guys relax! It’s all part of the learning experience of Tesla FSD.
    Some of you may die, but that’s a sacrifice I’m willing to make.

    Regards
    Elon Musk
    CEO of Tesla

    • Gammelfisch@lemmy.world · 7 days ago

      +1 for you. However, replace “Regards” with the more appropriate words from the German language. The first with an S, and the second an H. I will not type that shit, fuck Leon and I hope the fucking Nazi owned Tesla factory outside of Berlin closes.

      • Buffalox@lemmy.world · 7 days ago

        Yes, I’m not writing that shit, even in a sarcastic post. But I get your drift.
        On the other hand, since you are from Germany: the VW group is absolutely killing it on EVs recently, IMO.
        They totally dominate the top 10 EVs here in Denmark, with 7 of the 10 top-selling models!
        They are competitively priced, and they are the best combination of quality and range in their price ranges.

  • Gork@lemm.ee · 7 days ago

    Lidar needs to be a mandated requirement for these systems.

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · 7 days ago

      Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

      • Nastybutler@lemmy.world · 7 days ago

        No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent.

    • TrackinDaKraken@lemmy.world · 7 days ago

      How about we disallow it completely until it’s proven to be SAFER than a human driver? Why even allow it if it’s only as safe?

      • explodicle@sh.itjust.works · 7 days ago

        As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.

        • scarabic@lemmy.world · 7 days ago

          It’s hardly either/or, though. What we have here is empirical data showing that cars without lidar perform worse, so mandating lidar is itself based on empirical results. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality-statistics targets.

          • explodicle@sh.itjust.works · 6 days ago

            We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.

            • scarabic@lemmy.world · 6 days ago

              Those are ways to gather empirical results, though they rely on artificial, staged situations.

              I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. That kind of thing can still be well founded in data.

      • scarabic@lemmy.world · 7 days ago

        This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?

        • NotMyOldRedditName@lemmy.world · 6 days ago

          There’s been 54 reported fatalities involving their software over the years in the US.

          That’s around 10 billion AP miles (9 billion at the end of 2024), and around 3.6 billion on the various versions of FSD (beta / supervised). Most of the fatal accidents happened on AP, though, not FSD.

          Let’s just double those fatal accidents to 108 to make it worldwide, but that probably skews high. Most of the fatal stuff I’ve seen is always in the US.

          That equates to 1 fatal accident every 125.9 million miles.

          The USA average per 100 million miles is 1.33 deaths, so even doubling the deaths it’s less than the current national average. That’s the equivalent of 1.33 deaths every 167 million miles with Tesla’s software.

          Edit: I couldn’t math, fixed it. Also for FSD specifically, very few places have it. Mainly North America, and just recently, China. I wish we had fatalities for FSD specifically.
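          The back-of-the-envelope arithmetic above can be sanity-checked in a few lines. All figures are the commenter’s own estimates (not official NHTSA or Tesla data):

          ```python
          # Rough check of the commenter's estimates (not official figures).
          ap_miles = 10e9        # estimated Autopilot miles driven
          fsd_miles = 3.6e9      # estimated FSD (beta/supervised) miles
          fatalities = 54 * 2    # 54 reported US fatalities, doubled as a worldwide guess

          total_miles = ap_miles + fsd_miles
          miles_per_fatality = total_miles / fatalities
          print(f"{miles_per_fatality / 1e6:.1f}M miles per fatality")  # ~125.9M

          # US average: 1.33 deaths per 100 million vehicle miles
          us_rate = 1.33 / 100e6
          tesla_rate = fatalities / total_miles
          print(f"Estimated rate vs US average: {tesla_rate / us_rate:.2f}x")  # ~0.60x
          ```

          Even with the doubled fatality count, the estimated rate comes out below the US average, which matches the commenter’s conclusion. Note the comparison glosses over road mix (AP runs mostly on highways, which are safer per mile than the overall average).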

  • 0x0@programming.dev · 7 days ago

    This is news? Fortnine talked about it two years ago.
    TL;DR Tesla removed LIDAR to save a buck, and at night the cameras see two red dots that the ’puter thinks are a far-away car, when it’s actually a close motorcycle.

    • LesserAbe@lemmy.world · 7 days ago

      It’s helpful to remember that not everyone has seen the same stories you have. If we want something to change, like regulators not allowing dangerous products, then raising public awareness is important. Expressing surprise that not everyone knows about something can be counterproductive.

      Going beyond that, wouldn’t the new information here be the statistics?

      • JordanZ@lemmy.world · 7 days ago

        My state allowed motorcycle filtering in 2019 (not the same as California’s lane splitting). They ran a study and found a ton of motorcyclists were being severely injured or killed by getting rear-ended while sitting at stop lights. Filtering allows them to move to the front of the traffic light while the light is red and traffic is stationary. Many people are super aggravated about it, even though most of the world has been doing it basically forever.

      • bluGill@fedia.io · 7 days ago

        like regulators not allowing dangerous products,

        I include human drivers in the list of dangerous products I don’t want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don’t want regulators to pick favorites; I want them to find “the truth”.

        • LesserAbe@lemmy.world · 7 days ago

          Sure, we’re in agreement as far as that goes. My point was just the commenter above me was indicating it should be common knowledge that Tesla self driving hits motorcycles more than other self driving cars. And whether their comment was about this or some other subject, I think it’s counterproductive to be like “everyone knows that.”

      • AA5B@lemmy.world · 6 days ago

        Why not? It’s got multiple cameras, so it could judge distances the same way humans do.

        However, there have been both hardware and software updates since most of those crashes, so the critical question is how much of a problem it still is. The article had no info or speculation on that.

    • EndlessNightmare@reddthat.com · 6 days ago

      The argument is that humans can drive with just 2 eyes, so cameras are enough. I disagree with this position, given the limitations of a camera-only system. But that’s what it is.

      Different sensors excel at different tasks and different conditions, and cameras are not always it.

  • Ledericas@lemm.ee · 6 days ago

    The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.

  • SkunkWorkz@lemmy.world · 6 days ago

    It’s because the system has to rely on visual cues, since Teslas have no radar. The system looks at the tail lights when it’s dark to gauge the distance from the vehicle. And since some bikes have a double light, the system thinks it’s a car far in front of them, when in reality it’s a bike up close. Also remember the AI is trained on human driving behavior, which Tesla records from their customers. And we all know how well the average human drives around two-wheeled vehicles.
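    The ambiguity described above falls out of basic pinhole-camera geometry: projected pixel separation is proportional to real separation divided by distance, so two very different scenes can look identical. A minimal sketch (illustrative numbers only, not Tesla’s actual pipeline):

    ```python
    # Pinhole projection: pixel separation = focal_px * real_separation / distance.
    # A motorcycle's narrow tail lights up close can project to the same pixel
    # spacing as a car's wide tail lights far away.
    FOCAL_PX = 1000  # assumed focal length in pixels (illustrative)

    def pixel_separation(light_spacing_m: float, distance_m: float) -> float:
        """Projected separation in pixels between two tail lights."""
        return FOCAL_PX * light_spacing_m / distance_m

    car = pixel_separation(light_spacing_m=1.5, distance_m=75.0)   # car, far away
    bike = pixel_separation(light_spacing_m=0.3, distance_m=15.0)  # motorcycle, close

    print(car, bike)  # both 20.0 px: indistinguishable by spacing alone
    ```

    Without depth information (radar, lidar, or reliable stereo/parallax cues), spacing alone cannot disambiguate the two cases, which is consistent with the rear-end pattern in the crash data.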

  • AnimalsDream@slrpnk.net · 6 days ago

    I imagine bicyclists must be affected as well if they’re on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.

    Time to go to the Netherlands.

    • EndlessNightmare@reddthat.com · 6 days ago

      this makes me never want to bike in the US again.

      I live close enough to work for it to be a very reasonable biking distance. But there is no safe route. A high-speed “stroad” with a narrow little bike lane. It would only be a matter of time before some asshole with their face in their phone drifts into me.

      I am deeply resentful of our automobile-centric infrastructure in the U.S. It’s bad for the environment, bad for our wallets, bad for our waistlines, and bad for physical safety.

    • xor@lemmy.dbzer0.com · 6 days ago

      humans driving cars still target bicyclists on purpose, so i don’t see how teslas could be any worse…

      p.s. painting a couple lines on the side of the road does not make a safe bike lane… they need a physical barrier separating the road from them… like how curbs separate the road from sidewalks…

      • AnimalsDream@slrpnk.net · 6 days ago

        I mean yeah, I just said above that someone almost killed me. They were probably a human driver. But that’s a “might happen, never know.” If self-driving cars are rear-ending people, that’s an inherent artifact of their programming, even though it’s not intentionally programmed to do that.

        So it’s like, things were already bad. I already do not feel safe doing any biking anymore. But as self-driving cars become more prevalent, that threat upgrades to a kind of de facto, “Oh, these vast stretches of land are places where only cars and trucks are allowed. Everything else is roadkill waiting to happen.”

  • lnxtx (xe/xem/xyr)@feddit.nl · 7 days ago

    Stop dehumanizing drivers who killed people.
    The feature, wrongly called Full Self-Driving, must be supervised at all times.

    • SouthEndSunset@lemm.ee · 7 days ago

      If you’re going to say your car has “full self driving”, it should have that, not “full self driving (but needs monitoring)” or “full self driving (but it disconnects 2 seconds before impact)”.

    • Ulrich@feddit.org · 7 days ago

      I think it’s important to call out inattentive drivers while also calling out the systems and false advertising that may lead them to become less attentive.

      If these systems were marketed as “driver assistance systems” instead of “full self driving”, certainly more people would pay attention. The fact that they’ve been allowed to get away with this blatant false advertising is astonishing.

      They’re also obviously not adequately monitoring for driver attentiveness.

  • Redex@lemmy.world · 6 days ago

    Cuz other self driving cars use LIDAR so it’s basically impossible for them to not realise that a bike is there.

    • kameecoding@lemmy.world · 7 days ago

      Because muh freedum, EU are a bunch of commies for not allowing this awesome innovation on their roads

      (I fucking love living in the EU)

    • Not_mikey@lemmy.dbzer0.com · 7 days ago

      Robots don’t get drunk, or distracted, or text, or speed…

      Anecdotally, I think the Waymos are more courteous than human drivers. Though Waymo seems to be the best one out so far; idk about the other services.

        • dogslayeggs@lemmy.world · 7 days ago

          They have remote drivers that CAN take control in rare corner-case situations that the software can’t handle. The vast majority of driving is done without humans in the loop.

          • NotMyOldRedditName@lemmy.world · 6 days ago

            They don’t even do that, according to Waymo’s claims.

            They can suggest what the car should do, but they aren’t actually doing it. The car is in complete control.

            It’s a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.

            • KayLeadfoot@fedia.io (OP) · 6 days ago

              Interesting! I did not know that - I assumed the teleoperators took direct control, but that makes much more sense for latency reasons (among others)

              • NotMyOldRedditName@lemmy.world · 6 days ago

                I always just assumed it was their way to ensure the vehicle was really autonomous. If you have someone remotely driving it, you could argue it isn’t actually an AV. Your latency idea makes a lot of sense as well though. Imagine taking over and causing an accident due to latency? This way even if the operator gives a bad suggestion, it was the car that ultimately did it.

    • bluGill@fedia.io · 7 days ago

      Humans are terrible drivers. The open question is whether self-driving cars are overall safer than human-driven cars. So far the only people talking either don’t have data, or have reason to cherry-pick only the parts of the data that make self-driving look good. This is the one exception where someone seemingly independent has done analysis; the question is whether they are unbiased, or cherry-picking data to make self-driving look bad (I’m not familiar with the source, so I can’t answer that).

      Either way more study is needed.

      • KayLeadfoot@fedia.io (OP) · 7 days ago

        I am absolutely biased. It’s me, I’m the source :)

        I’m a motorcyclist, and I don’t want to die. Also just generally, motorcyclists deserve to get where they are going safely.

        I agree with you. Self-driving cars will overall greatly improve highway safety.

        I disagree with you when you suggest that pointing out flaws in the technology is evidence of bias, or “cherry picking to make self driving look bad.” I think we can improve on the technology by pointing out its systemic defects. If it hits motorcyclists, take it off the road, fix it, and then save lives by putting it back on the road.

        That’s the intention of the coverage, at least: I am hoping to apply pressure to improve rather than remove. Read my Waymo coverage, I’m actually a big automation enthusiast, because fewer crashes is a good thing.

        • bluGill@fedia.io · 7 days ago

          I wasn’t trying to suggest that you are biased, only that I have no clue and so it is possible you are somehow unfairly doing something.

          • KayLeadfoot@fedia.io (OP) · 6 days ago

            Perfectly fair. Sorry, I jumped the gun! Good on you for being incredulous and inspecting the piece for manipulation, that’s smart.

      • Rhaedas@fedia.io · 7 days ago

        Humans are terrible. The human eyes and brain are good at detecting certain things, though, allowing reactions where computer vision, especially when using only one detection method, often fails. There are times when an automated system will prevent a problem before a human could even see it. So far neither is the clear winner; human driving just has a legacy that automation has to beat by a wide margin, not just match.

        On the topic of human drivers, I think most on the road drive reactively rather than with prediction and anticipation. Given the speed and possible detection methods, a well-designed automated system should be excelling at this. It costs more and is more complex to design such a thing, so we’re getting the bare-bones minimum tech can give us right now, which again is not a replacement for all cases.

    • Bytemeister@lemmy.world · 7 days ago

      Because the march of technological advancement is inevitable?

      In light of recent (and, let’s face it, long-ago) cases, Tesla’s “Full Self Driving” needs to be downgraded to level 2 at best.

      Level 2: Partial Automation

      The vehicle can handle both steering and acceleration/deceleration, but the driver must remain engaged and ready to take control.

      Pretty much the same level as other brands’ self-driving features.

      • AngryCommieKender@lemmy.world · 7 days ago

        The other brands, such as Audi and VW, work much better than Tesla’s system. Their LIDAR systems aren’t blinded by fog and rain the way Tesla’s cameras are. Someone recently tested an Audi with its system against a Tesla with its system. The Tesla failed either 3/5 or 4/5 tests. The Audi passed 3/5 or 4/5. Neither system is perfect, but the one that doesn’t rely on just cameras is clearly superior.

        Edit: it was Mark Rober.

        https://youtu.be/IQJL3htsDyQ

        • Bytemeister@lemmy.world · 7 days ago

          It’s hard to tell, but from about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR for object detection, even multiple RADAR systems for parking. There may be some which include a time-of-flight sensor, which is like LIDAR but static, and lacks the resolution/fidelity. My Mach-E, which has level 2 automation, uses a combination of computer vision, RADAR, and GPS. I was unable to locate a LIDAR sensor for the vehicle.

          The LIDAR system in Mark’s video is quite clearly a pre-production device that is not affiliated with the vehicle manufacturer it was being tested on.

          Adding, after more searching: it looks like the Polestar 3, some trim levels of the Audi A8, and the Volvo EX90 include a LiDAR sensor. Curious to see how the consumer-grade tech works out in the real world.

          Please do not mistake this comment as “AI/computer vision” evangelism. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else’s to that system.

          • AngryCommieKender@lemmy.world · 7 days ago

            The way I understand it, is that Audi, Volvo, and VW have had the hardware in place for a few years. They are collecting real world data about how we drive before they allow the systems to be used at all. There are also legal issues with liability.

          • KayLeadfoot@fedia.io (OP) · 7 days ago

            Mercedes uses LiDAR. They also operate the sole Level 3 driver automation system in the USA. Two models only, the new S-Class and EQS sedans.

            Tesla alleges they’ll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We’ll see.

            • Bytemeister@lemmy.world · 7 days ago

              Yeah, keep in mind that Elon couldn’t get level 3 working in a closed, pre-mapped circuit. The robotaxis were just remotely operated.

  • werefreeatlast@lemmy.world · 7 days ago

    Every captcha… can you see the motorcycle? I would be afraid if they wanted all the squares with small babies, or maybe just regular folk… can you pick all the hotties? Which of these are body parts?