A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

  • orca@orcas.enjoying.yachts

    The worst part is that this problem has already been solved by using LIDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.

      • KayLeadfoot@fedia.ioOP

        Probably Zoox, but conceptually similar: LiDAR-backed.

        You can immobilize them by setting anything large on them. Your purse, a traffic cone, a person :)

        Probably makes sense to be a little cautious with the gas pedal when there is anything on top of the vehicle.

      • TheGrandNagus@lemmy.world

        A human also (hopefully anyway) wouldn’t drive if you put a cone over their head.

        Like yeah, if you purposely block the car’s vision, it should refuse to drive.

      • ayyy@sh.itjust.works

        The same is true when you put a cone in front of a human driver’s vision. I don’t understand why “haha I blocked the vision of a driver and they stopped driving” is a gotcha.

    • NotMyOldRedditName@lemmy.world

      I wouldn’t really call it a solved problem when Waymo, with LiDAR, is crashing into physical objects:

      https://www.msn.com/en-us/autos/news/waymo-recalls-1200-robotaxis-after-cars-crash-into-chains-gates-and-utility-poles/ar-AA1EMVTF

      NHTSA stated that the crashes “involved collisions with clearly visible objects that a competent driver would be expected to avoid.” The agency is continuing its investigation.

      It’d probably be better to say that Lidar is the path to solving these problems, or a tool that can help solve it. But not solved.

      Just because you see a car working perfectly doesn’t mean it is always working perfectly.

    • FreedomAdvocate@lemmy.net.au

      Lidar doesn’t completely solve the issue lol. Lidar can’t see line markings, speed signs, pedestrian crossings, etc. Cars equipped with lidar crash into things too.

      • orca@orcas.enjoying.yachts

        I oversold it in my original comment, but it still performs better than using regular cameras like Tesla did. It performs better in weather and other scenarios than standard cameras. Elon is dumb though and doesn’t think LiDAR is needed for self-driving.

    • kambusha@sh.itjust.works

      Except for the last 0.05 seconds before the crash where the human was put in control. Therefore, the human caused the crash.

  • Buffalox@lemmy.world

    The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

    What I don’t get is how years of this false advertising haven’t bankrupted Tesla already.

      • ayyy@sh.itjust.works

        To put your number into perspective, if it only failed 1 time in every hundred miles, it would kill you multiple times a week with the average commute distance.
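
A quick back-of-the-envelope check of that arithmetic. The per-mile failure rate and the 40-mile round-trip commute below are assumptions for illustration, not figures from the thread:

```python
# Hypothetical numbers: one dangerous failure per 100 miles, and a
# 40-mile round-trip commute five days a week (roughly the US average).
FAILURES_PER_MILE = 1 / 100
DAILY_COMMUTE_MILES = 40
WORKDAYS_PER_WEEK = 5

# Exposure per week, then expected failures = rate * exposure.
weekly_miles = DAILY_COMMUTE_MILES * WORKDAYS_PER_WEEK
expected_failures_per_week = weekly_miles * FAILURES_PER_MILE

print(expected_failures_per_week)  # about 2 failures per week at these rates
```

Even a rate that sounds low gets multiplied by weekly exposure into a regular event, which is the commenter’s point.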

        • KayLeadfoot@fedia.ioOP

          Someone who doesn’t understand math downvoted you. This is the right framework for understanding autonomy: the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not credibly demonstrated non-negative value.

          • bluewing@lemm.ee

            You are trying to judge the self-driving feature in a vacuum, and you can’t do that. You need to compare it to the alternatives. And for automotive travel, the alternative to FSD is to continue to have everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD (as bad as it is). So FSD doesn’t need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last time I saw anything about that, FSD was that “bit better” than you, statistically.

            FSD isn’t perfect. No such system will ever be perfect. But, the goal isn’t perfect, it just needs to be better than you.
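
The “just needs to be better than you” criterion is an argument about expected harm per mile of exposure. A minimal sketch of that comparison, where both crash rates and the annual mileage are placeholder assumptions rather than real statistics:

```python
# Placeholder rates, NOT real crash statistics: the point is the
# structure of the comparison, not the numbers.
HUMAN_CRASHES_PER_MILLION_MILES = 2.0
FSD_CRASHES_PER_MILLION_MILES = 1.5
ANNUAL_MILES = 13_000  # assumed typical annual mileage

def expected_annual_crashes(rate_per_million_miles: float,
                            miles: float = ANNUAL_MILES) -> float:
    """Expected crashes = rate * exposure."""
    return rate_per_million_miles * miles / 1_000_000

human = expected_annual_crashes(HUMAN_CRASHES_PER_MILLION_MILES)
fsd = expected_annual_crashes(FSD_CRASHES_PER_MILLION_MILES)

# Under these assumed rates, the automation clears the "better than
# the average driver" bar; under reversed rates it would not.
print(f"human: {human:.4f}, fsd: {fsd:.4f}")
```

The whole disagreement in this thread is over whether the real-world FSD rate is actually on the right side of that inequality.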

        • NιƙƙιDιɱҽʂ@lemmy.world

          …It absolutely fails miserably fairly often and would likely crash that frequently without human intervention, though. Not to the extent here, where there isn’t even time for human intervention, but I frequently had to take over when I used to use it (post v13).

      • Buffalox@lemmy.world

        For many years the “supervised” was not included; AFAIK Tesla was forced to add it.
        And in this case “supervised” isn’t even enough, because the car made an abrupt, unexpected maneuver instead of asking the driver to take over in time to react.

        • FreedomAdvocate@lemmy.net.au

          The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.

          • SkyezOpen@lemmy.world

            The attention required to prevent these types of sudden crashes negates the purpose of FSD entirely.

          • Buffalox@lemmy.world

            No. Look at Waymo as an example: they are actually autonomous, and they stop to ask for assistance in situations they are “unsure” how to handle.

            But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? The road ahead was clear, with nothing in view to indicate any kind of problem, when the car made a sudden, abrupt left that rolled it onto its roof.

            • FreedomAdvocate@lemmy.net.au

              They can’t stop and ask for assistance at 100km/h on a highway.

              I hope Tesla/Musk address this accident and get the telemetry from the car, because there’s no evidence that FSD was even on.

              • Buffalox@lemmy.world

                According to the driver it was on FSD, and it was using the latest software update available.

                https://www.reddit.com/user/SynNightmare/

                They can’t stop and ask for assistance at 100km/h on a highway.

                Maybe the point, then, is that Tesla FSD shouldn’t be legal to use on a highway.
                But it probably shouldn’t be used anywhere, because it’s faulty as shit.
                And why can’t it slow down to let the driver take over in a timely manner, when it can brake for no reason?
                It was tested in Germany on the Autobahn, where it did that 8 times within 6 hours!

                • FreedomAdvocate@lemmy.net.au

                  According to the driver, with zero evidence backing up the claim. With how much of a hard-on everyone has for blaming Elon Musk for everything, and for trying to drag Tesla’s stock down, this accident is a surefire way to thousands of Internet karma and e-fame on sites like Reddit and Lemmy. Why doesn’t he just show us the interior camera?

                  Looking at his profile, he’s milking this for all it’s worth: he’s posted the same thread to like 8 different subs lol. He’s karma whoring. He probably wasn’t even the one involved in the crash.

                  Looked at his Twitter, which he promoted on there too, and of course he tags Mark Rober and is retweeting everything about this crash. He’s loving the attention and doing everything he can to get more.

                  Also, he had the car for less than 2 weeks and said he used FSD “all the time”… in a brand-new car he’d basically never driven… and then it has this catastrophic failure? Yeah, nah lol. Also, as others in some of the threads have pointed out, the version of FSD he claims it was on wasn’t out at the time of his accident.

                  Dude’s lying through his teeth.

  • RandomStickman@fedia.io

    Anything outside of a freshly painted and paved LA road at high noon while it’s sunny isn’t ready for self-driving, it seems.

      • Zwuzelmaus@feddit.org

        Tunnels are extra dangerous. Not because of the likelihood of an accident, but because of the situation if an accident happens: it easily blocks the tunnel, fills it with smoke, and kills hundreds.

        Except in newly built tunnels in rich countries.

  • Skyrmir@lemmy.world

    I use autopilot all the time on my boat. No way in hell I’d trust it in a car. They all occasionally get suicidal. Mine likes to lull you into a sense of false security, then take a sharp turn into a channel marker or cargo ship at the last second.

    • dependencyinjection@discuss.tchncs.de

      Exactly. My car doesn’t have AP, but it does have a shedload of sensors, and sometimes it just freaks out about stuff being too close to the car for no discernible reason. Really freaks me out, as I’m like: what do you see, bro? We’re just driving down the motorway.

      • ayyy@sh.itjust.works

        For mine, it’s the radar seeing the retro-reflective stripes on utility poles as brighter than it expects.

    • moving to lemme.zip. @lemm.ee

      They have autopilot on boats? I never even thought about that existing. Makes sense, just never heard of it until now!

      • JohnEdwa@sopuli.xyz

        They’ve technically had autopilots for over a century; the first was on the oil tanker J.A. Moffett in 1920. Though the main purpose is to keep the vessel going dead straight, as otherwise wind and currents turn it, so in modern car terms I think it would be more accurate to say they have lane assist. Commercial ones can often do waypoint navigation, following a set route on a map, but I don’t think that’s very common on personal vessels.
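
That "keep the vessel dead straight" behaviour is essentially a heading-hold loop. A toy sketch of the idea, where the gain and rudder limit are made-up values rather than anything from a real marine autopilot:

```python
def heading_hold(current_deg: float, target_deg: float,
                 gain: float = 0.8, max_rudder: float = 35.0) -> float:
    """Return a proportional rudder command in degrees.

    Positive output means turn to starboard. This is only a toy
    proportional controller; real autopilots add integral/derivative
    terms to reject the steady push of wind and current.
    """
    # Smallest signed angle from current to target, in (-180, 180].
    error = (target_deg - current_deg + 180) % 360 - 180
    # Clamp to the assumed physical rudder limit.
    return max(-max_rudder, min(max_rudder, gain * error))
```

For example, a boat pointing 350° with a target of 010° gets a modest starboard command rather than a 340° swing to port, because the error term wraps through north.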

  • Aku@lemm.ee

    Self-driving is the future, but I’m glad I’m not a beta tester.

    • KayLeadfoot@fedia.ioOP

      You’re probably right about the future, but like, damn, I wish they would slow their roll and use LiDAR.

      • FaceDeer@fedia.io

        Elon Musk decided they absolutely would not use LiDAR years ago, when LiDAR was expensive enough that a decision like that made economic sense to at least try making work. Nowadays LiDAR is a lot cheaper, but for whatever reason Musk has drawn a line in the sand and refuses to back down on it.

        Unlike many people online these days I don’t believe that Musk is some kind of sheer-luck bought-his-way-into-success grifter, he has been genuinely involved in many of the decisions that made his companies grow. But this is one of the downsides of that (Cybertruck is another). He’s forced through ideas that turned out to be amazing, but he’s also forced through ideas that sucked. He seems to be increasingly having trouble distinguishing them.

        • LadyAutumn@lemmy.blahaj.zone

          He really hasn’t. He purchased companies that were already sitting on profitable ideas. He is not an engineer. He is not a scientist. He has no training in any design discipline. He takes credit for the ideas of people he pays. He takes credit for the previous achievements of companies he’s purchased.

          What is it going to fucking take for people to finally see the grifter for what he is? He’s never had a single good fucking R&D idea in his life 🙃 He has wasted billions of dollars researching and developing absolutely useless ideas that have benefited literally no one and have not made him any money. It is absolutely incredible how powerful his mythos is, that people still believe him to be (or have been) some kind of engineer. He’s a fucking racist nepo baby who has never done a single useful thing in his life. He wasn’t the sole individual involved in creating PayPal (and was entirely uninvolved in turning it into the successful business it became); he didn’t found Tesla, nor is he responsible for any of the technological developments it made (except for forcing through his shitty charger design that notoriously breaks down and charges at half the speed of competitors’); he did not found SpaceX, and by all accounts he has been loathed by everyone at the company for the past decade for continuously committing workers’-rights violations and fostering a racist, sexist, and ableist work environment. The man has done nothing but waste people’s time stoking his ego and sexually abusing a slew of employees for the past two and a half decades.

        • Buffalox@lemmy.world

          Musk has drawn a line in the sand and refuses to back down on it.

          From what I heard, the upcoming Tesla robotaxi test cars, based on the Model Y, are supposed to have LiDAR. But it’s ONLY the robotaxi version that has it.

          He seems to be increasingly having trouble distinguishing them.

          Absolutely. It seems to me he has been delusional for years, and it’s getting worse.

    • madcaesar@lemmy.world

      Self-driving via cameras IS NOT THE FUTURE!! Cameras are basically slightly better human eyes, and human eyes suck ass.

  • sickofit@lemmy.today

    This represents the danger of expecting driver override to avoid accidents. If the driver has to be prepared to take control in an accident like this AT ALL TIMES, then the driver is required to be more engaged than they would be if they were just driving manually, because they have to constantly anticipate not just what other hazards (drivers, pedestrians, …) might be doing; they also have to anticipate the ways their own vehicle may be trying to kill them.

    • Bytemeister@lemmy.world

      Absolutely.

      I’ve got a car with level 2 automation, and after using it for a few months, I can say that it works really well, but you still need to be engaged to drive the car.

      What it is good at: maintaining lanes, even in tricky situations with poor paint/markings, and maintaining speed and distance from the car in front of you.

      What it is not good at: tricky traffic, congestion, or sudden stops; lane changes; accounting for cars coming up behind you; avoiding road hazards.

      I use it mostly like an autopilot. The car takes some of the monotonous workload out of driving, which allows me to move my focus from driving the car to observing traffic, other drivers, and road conditions.

  • LadyAutumn@lemmy.blahaj.zone

    I am never getting into a self-driving car. I don’t understand why we are investing money in this technology when people can already drive cars on their own, and when we should be moving toward robust public transportation systems anyway. A waste of time and resources to… what, exactly? Stare at your phone for a few extra minutes a day? Working from home and every city having a robust electric transit system is what the future is supposed to be.

    • underline960@sh.itjust.works

      Back when I still believed, I was excited because I wanted to get in my car and take a 90-minute nap until I arrived at work.

      With public transportation, you can only be half-asleep or you’ll miss your stop.

      • Beej Jorgensen@lemmy.sdf.org

        I used to dream of watching a movie then falling asleep in bed while my car drove the 8 hours to my folks’ house.

        But I’d want that beast to be bristling with sensors of every kind. None of this “cameras only” idiocy.

        Someday. Maybe.

      • cestvrai@lemm.ee

        I have a 45 minute high speed train commute to a busy end-of-line station. I can sleep, read, work, or just stare out the window and think.

        Same commute is probably twice as long by car during rush hour.

      • LadyAutumn@lemmy.blahaj.zone

        In general I am opposed to machines being in direct control of weapons. I am also definitely of the opinion that there are lots of people who shouldn’t be driving.

    • FreedomAdvocate@lemmy.net.au

      People crash cars far, far, far more than Tesla FSD crashes “per capita”. People are terrible drivers on average.

    • KayLeadfoot@fedia.ioOP

      Ditto! They were about a foot from hitting the tree head-on rather than glancing off; it could easily have been fatal. Weirdly small axes of random chance that the world spins on.

      • Corkyskog@sh.itjust.works

        I still don’t understand what made it happen. I kept watching shadows and expecting it to happen earlier.

        • Phen@lemmy.eco.br

          There’s some difference in the fences on the left side at the exact moment the other car passed in the opposite lane. My guess is that the timing of the other car made the software interpret those changes in the input as something moving rather than simply something being different.

        • IllNess@infosec.pub

          I thought it might be following the tire tracks, but no. It just decided to veer completely off.

  • melsaskca@lemmy.ca

    I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s

  • atmorous@lemmy.world

    For no reason?

    They are running proprietary software in the car, and people don’t even know what is happening in the background of it. Every electric car needs to be turned into an open-source car, so that the car cannot be tampered with, there’s no surveillance, etc.

    Everyone should advocate for that, because the alternative is this, as with Tesla. And I know nobody wants this happening to other manufacturers’ cars either.

  • itisileclerk@lemmy.world

    Why would someone be a passenger in a self-driving vehicle? Do they know that they are test subjects, part of a “car trial” (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe alternatives, like high-speed trains.

    • dan1101@lemm.ee

      I have no idea; I guess they have a lot more confidence in self-driving (ESPECIALLY Tesla’s) than I do.