A driverless car in San Francisco drove right into wet concrete and got stuck after seemingly mistaking it for a regular road: ‘It ain’t got a brain’

The site had been marked off with construction cones and workers stood with flags at each end of the block, according to city officials.

  • theluddite@lemmy.ml · 1 year ago

    Every time one of these things happens, there’s always comments here about how humans do these things too. Two responses to that:

    First, human drivers are actually really good at driving. Here’s Cory Doctorow explaining this point:

    Take the much-vaunted terribleness of human drivers, which the AV industry likes to tout. It’s true that the other dumdums on the road cutting you off and changing lanes without their turn-signals are pretty bad drivers, but actual, professional drivers are amazing. The average school-bus driver clocks up 500 million miles without a fatal crash (but of course, bus drivers are part of the public transit system).

    Even dopes like you and me are better than you may think – while cars do kill the shit out of Americans, it’s because Americans drive so goddamned much. US traffic deaths are a mere one per 100 million miles driven, and most of those deaths are due to recklessness, not inability. Drunks, speeders, texters and sleepy drivers cause traffic fatalities – they may be skilled drivers, but they are also reckless.

    There’s like a few hundred robot taxis driving relatively few miles, and the problems are constant. I don’t know of anyone who has plugged the numbers yet, but I suspect they look pretty bad by comparison.
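    A rough back-of-envelope of the per-mile comparison being argued here, using only numbers that appear in this thread (one human fatality per 100 million miles; 55 AV emergency-service incidents). The fleet size and per-car mileage below are placeholder assumptions, not real data, so this is a sketch of the normalization, not a verdict:

    ```python
    # Back-of-envelope incident-rate comparison, normalized per mile driven.
    # The human baseline is from the Doctorow quote above; the AV fleet size
    # and per-car mileage are illustrative assumptions only.

    HUMAN_FATALITIES_PER_MILE = 1 / 100_000_000  # "one per 100 million miles driven"

    def incidents_per_million_miles(incidents: int, fleet_size: int,
                                    miles_per_car: float) -> float:
        """Normalize a raw incident count by total fleet miles driven."""
        total_miles = fleet_size * miles_per_car
        return incidents / total_miles * 1_000_000

    # Assumed scenario: 55 emergency-service incidents (figure from this
    # thread), an assumed fleet of 300 cars at an assumed 10,000 miles each.
    av_rate = incidents_per_million_miles(55, 300, 10_000)
    human_fatal_rate = HUMAN_FATALITIES_PER_MILE * 1_000_000

    print(f"AV emergency-service incidents per million miles: {av_rate:.2f}")
    print(f"Human fatalities per million miles: {human_fatal_rate:.2f}")
    ```

    Note the two rates measure different things (emergency-service incidents vs. fatalities), which is part of why the comparison stays contested; the point of the sketch is only that any comparison has to divide by total miles first.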

    Second, when self-driving cars fuck up, they become everyone else’s problem. Emergency service personnel, paid for by the taxpayer, are suddenly stuck having to call corporate customer service or whatever. When a human fucks up, there’s also a human on the scene to take responsibility for the situation and figure out how to remedy it (unless it’s a terrible accident and they’re disabled or something, but that’s an edge case). When one of these robot taxis fucks up, it becomes the problem of whoever they’re inconveniencing, be it construction workers, firefighters, police, whatever.

    This second point is classic corporate behavior. Companies look for ways to convert their internal costs (in this case, the labor of taxi drivers) into externalities, pushing down their costs but leaving the rest of us to deal with their mess. For example, plastic packaging is much, much cheaper for companies than collecting and reusing glass bottles or whatever, but the trash now becomes everyone else’s problem, and at this point, there is microplastic in literally every place on Earth.

    • joe@lemmy.world · 1 year ago

      I’m not sure your second point is as strong as you believe it to be. Do you have a specific example in mind? I think most vehicle problems that would require an emergency responder will have easy access to a tow service to deal with the car with or without a human being involved. It’s not like just because a human is there that the problem is more easily solved. For minor-to-moderate accidents that just require a police report, things might get messy but that’s an issue with the law, not necessarily something inherently wrong with the concept of self driving vehicles.

      Also, your first point is on shaky ground, I think. I don’t know why the metric is accidents with fatalities, but since that’s what you used, what do you think having fewer humans involved does to the chance of killing a human?

      I’m all for numbers being crunched, and to be clear (as you were, I think) the numbers are the real deciding metrics here, not thought experiments.

      And I think it’s 100% true that autonomous transportation doesn’t have to be perfect, just better than humans. Not that you disagree with this, but it is probably what people are thinking when they say “humans do this too”.

      • theluddite@lemmy.ml · 1 year ago

        I’m not sure your second point is as strong as you believe it to be. Do you have a specific example in mind? I think most vehicle problems that would require an emergency responder will have easy access to a tow service to deal with the car with or without a human being involved. It’s not like just because a human is there that the problem is more easily solved. For minor-to-moderate accidents that just require a police report, things might get messy but that’s an issue with the law, not necessarily something inherently wrong with the concept of self driving vehicles.

        https://missionlocal.org/2023/08/cruise-waymo-autonomous-vehicle-robot-taxi-driverless-car-reports-san-francisco/

        The fire department in SF has made it very clear that these cars are a PITA for them. They are actively driving through emergency situations, cannot follow verbal instructions, drive over fire hoses, etc.

        Also, your first point is on shaky ground, I think. I don’t know why the metric is accidents with fatalities,

        Fatalities is just the number we have to compare. Self-driving car companies have been publishing a simulated fatality metric for a while now. I totally agree there are other ways to think about it. My point is that AV companies have a narrative that humans are actually bad at driving, and I think this comparison pokes a hole in that story.

        but since that’s what you used, what do you think having fewer humans involved does to the chance of killing a human?

        I’m not sure, actually. The vast majority of driving is solo trips, so I’d expect not that much? There are some studies suggesting that people might actually use cars more if self-driving cars become a reality:

        https://www.wired.com/story/driving-partially-automated-people-drive-more/

        And that really gets to the heart of my problem with the self-driving cars push. When faced with complex problems, we should not assume there is a technological solution. Instead, we should ask ourselves to envision a better world, and then decide what technologies, if any, we need to get there. If self-driving cars are actually a good solution to the problem, then by all means, let’s make them happen.

        But I don’t think that’s what’s happening here, and I don’t think they are. American cities are a fucking disaster of planning. They are genuinely shameful, forcing their inhabitants to rely on cars, an excessively wasteful mode of transportation, all in a climate crisis. Instead of coming together to work on this problem, we’re begging our technological overlords to solve them for us, with an added drawback of privatizing our public infrastructure.

      • meco03211@lemmy.world · 1 year ago

        Also, almost all safety numbers for transportation are meaningless unless normalized to miles driven. They also commented about these issues being “everywhere,” then went on a long diatribe against self-driving cars. I rarely see anything about them, though that’s likely shaped by the media I consume. They clearly have a bias, and the media they consume has likely been tailored to support it. Them seeing many articles on crashes or accidents is anecdotal at best (as is me not having seen many such articles).

        • theluddite@lemmy.ml · 1 year ago

          I’m not sure what you mean? All the numbers I used are explicitly normalized by, or discussed in the context of, distance driven. My comment contains the phrase “miles driven” several times. Doctorow’s piece that I quote from goes into more detail, again normalized by miles driven.

          https://pluralistic.net/2022/10/09/herbies-revenge/

          • meco03211@lemmy.world · 1 year ago

            Every time one of these things happens, there’s always comments here about how humans do these things too. Two responses to that:

            First, human drivers are actually really good at driving. Here’s Cory Doctorow explaining this point:

            Just saying “humans are good” is a flat statement with no impact. They would need to be better than self-driving cars for that to mean anything. The reason this gets pointed out every time news pops up of a self-driving car having an accident like this is that equivalent stories about human drivers don’t make headlines for someone like you to use as an anecdote.

            There’s like a few hundred robot taxis driving relatively few miles, and the problems are constant.

            This is where you didn’t normalize to miles. Amplified by the next sentence…

            I don’t know of anyone who has plugged the numbers yet, but I suspect they look pretty bad by comparison.

            You don’t know the numbers. You just feel strongly about it. That’s not evidence.

            • theluddite@lemmy.ml · 1 year ago

              Like I said, everything is normalized by miles or discussed in the context of distance driven.

              We don’t have concrete numbers for the real world cars, but we absolutely have enough to make educated estimates, and those line up with the existing data.

              In a few months, the cars had some 55 incidents with emergency services. iirc there were only a couple hundred cars. There are millions upon millions of cars in San Francisco driving orders of magnitude more miles than that, and the emergency services personnel are actively flagging the self driving cars as a serious problem.

              I’d obviously prefer to have better real-world data. The data that we do have consistently shows self-driving cars underperforming humans per mile driven by several orders of magnitude, as Doctorow mentioned in the piece I quoted. The data that does exist is also consistent with the emerging picture, albeit the numbers for that aren’t in yet.

              Afaik, there isn’t a single piece of data in existence in favor of self driving cars, but there is plenty against. If you have something to the contrary, lmk, because that would greatly change my opinion. I fucking want a self driving car. They sound rad as hell. But I don’t want to organize our entire society around more big tech vaporware.

    • maporita@unilem.org · 1 year ago

      There’s another difference between humans and computers you forgot to mention. Once a computer “learns” something (like avoiding driving into wet concrete), it will never make that mistake again. People, on the other hand, continue making the same error over and over.

      You are using an argument that is not new … pilots have used it for decades (and some still do) to complain about automation on the flight deck. Yet every day tens of thousands of airliners fly to their destination (and sometimes land there as well) with no pilot intervention. Pilots could easily be eliminated from airplanes … the reason they are still flying has more to do with PR and a public not willing to fly without a human up front. But automation has made air travel safer by an order of magnitude. It will do the same for cars.

      • Jagger2097@lemmy.world · 1 year ago

        Planes fly with significant distance between them, well above any major obstacles, along routes with very few turns. Cars on the other hand are close together, traveling along poorly marked routes that have significant amounts of turns, and need to dodge a lot of obstacles. It’s quite rare for a plane to hit a cat.

        • Magnergy@lemmy.world · 1 year ago

          Planes are serviced by teams on a regular basis. Regulations and double checking galore. There are trailers on the road at this moment being held together with welded closed vise grips.

          • anlumo@feddit.de · 1 year ago

            I remember that one episode of Pimp My Ride back in the 90s where the guy had a Frankenstein car that consisted of two half-cars welded together in the middle…

      • theluddite@lemmy.ml · 1 year ago

        That is such a massive oversimplification of how computer learning works that it’s neither here nor there.

        Also, automation might work in some cases and not others. Sometimes different things are similar, and sometimes they’re different. Just because similar arguments have been made before about different things doesn’t mean you get to discount them now in a different situation.

      • AfricanExpansionist@lemmy.ml · 1 year ago

        Airplanes don’t have to interact with hundreds of other vehicles all constantly changing speeds and course headings.

    • Bjornir@programming.dev · 1 year ago

      I’m not sure I see the point. AVs will end up being better than humans, if the cutting-edge tech isn’t already. There is nothing a human does that a computer can’t do faster. An AV has a 360° view and can react within a millisecond, while being just as precise as if it had all the time in the world to perform the maneuver.

      Plus, and this is the part I truly don’t understand about the quote: a computer can’t get drunk, sleepy, or distracted. So by nature it fixes what the quote considers to be the biggest problem with human drivers.

    • iopq@lemmy.world · 1 year ago

      It’s a software update away from getting better. Humans will forever be a risk to other humans when driving. I’m not saying it’s good yet, but people in 2020 thought “driverless cars will be forever 5 years away.”

      Yet here we are, talking about how bad they are. That’s an improvement from only limited testing a few short years ago