• Reality Suit@lemmy.one · 5 months ago

    The company is responsible. Waymo should get the citation. If there were a live driver, the driver would get the citation. If companies want to go down the AI route, then whoever owns the AI or is responsible for training it should be responsible for its actions.

    • Flying Squid@lemmy.world · 5 months ago

      Corporations are people until a crime is committed, at which point you can’t punish a corporation for a crime a person commits.

      I don’t understand it, but apparently that’s how it works.

    • FlowVoid@lemmy.world · 5 months ago

      Arizona law does allow officers to give out tickets when a robotaxi commits a traffic violation while driving autonomously; however, officers have to give them to the company that owns the vehicle. Doing so is “not feasible,” according to a Phoenix police spokesperson.

      • Chozo@fedia.io · 5 months ago

        I’m not sure why the police say it’s “not feasible” to issue Google a citation. Google is the registered owner of the vehicles and thus responsible for any actions they perform. Why not just mail them a ticket?

        • FlowVoid@lemmy.world · 5 months ago

          I’m just speculating, but there is probably a very efficient workflow for sending a ticket to an individual (given the number of tickets police write and the revenue they generate), and I wouldn’t be surprised if that workflow doesn’t accommodate an AI-operated vehicle. Kind of like how a restaurant would need to restructure its workflow to accommodate DoorDash.

          In other words, “infeasible” might actually mean “would take extra effort”.

          • SlopppyEngineer@lemmy.world · 5 months ago

            I thought the laws in the USA prevented this. It’s why you have manned speed traps: citations must be handed personally to the driver. Other countries have automated speed-check systems and send the ticket to the owner of the car, which can be a leasing company, for example.

            • Ferris@infosec.pub · 5 months ago

              How about you tape/glue copies of the ticket over the lenses of any exposed cameras and let Google figure out the logistics of paying it?

  • JeeBaiChow@lemmy.world · 4 months ago

    Isn’t a company at least responsible for the safe operation and training of its human drivers? Wouldn’t the same apply to the training of self-driving cars?

  • iknowitwheniseeit@lemmynsfw.com · 5 months ago

    If a human did this, they would at least get a ticket with a fine and have the violation recorded on their license, which would be revoked if it kept happening. With the computer-controlled car, the cop called customer support and was like, “hey, you might want to look into this or something.”

    I guess we can’t expect the people hired to protect capital to act against capital, but it’s still a bit disturbing.

    • Chozo@fedia.io · 5 months ago

      I used to work on the software for these cars, so I can speak to this a little. For what it’s worth, I’m no longer with the project, so I have no reason to be sucking Google’s dick if these weren’t my honest opinions on the tech being used here. None of this is to excuse or defend Google, just sharing my insight on how these cars operate based on my experiences with them.

      Waymo’s cars actually do a really good job at self-navigation. Like, sometimes it’s scary how good they actually are when you see the conditions they can operate under. There are so many layers of redundancy that you could lose all of the camera feeds, GPS, and cellular data, and they’d still be able to navigate themselves through traffic on the LIDAR sensors alone. Hell, even if you removed the LIDAR from that scenario, those cars would still accurately know their location based on the last known location combined with how many times each tire has revolved (they’d run into everything along the way, but at least they’d know where they were the entire time). All of the other sensors and data points collected by the cars actually end up making GPS the least accurate sensor on the car.
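
      To give a feel for what “last known location plus tire revolutions” means, here’s a toy dead-reckoning sketch. Every name and constant is made up for illustration; it’s nothing like the actual codebase.

      ```python
      import math

      # Made-up constants, purely for illustration.
      TIRE_CIRCUMFERENCE_M = 2.2   # distance covered per tire revolution
      TRACK_WIDTH_M = 1.6          # distance between left and right wheels

      def dead_reckon(x, y, heading_rad, left_revs, right_revs):
          """Update an (x, y, heading) estimate from tire revolutions alone."""
          left_m = left_revs * TIRE_CIRCUMFERENCE_M
          right_m = right_revs * TIRE_CIRCUMFERENCE_M
          distance_m = (left_m + right_m) / 2                # vehicle center
          heading_rad += (right_m - left_m) / TRACK_WIDTH_M  # differential turn
          return (x + distance_m * math.cos(heading_rad),
                  y + distance_m * math.sin(heading_rad),
                  heading_rad)
      ```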

      That said, the article mentions that it was due to “inconsistent construction signage”, which I’d assume to be pretty accurate from my own experience with these cars. Waymo’s cars are usually really good at detecting cone placements and determining where traffic is being rerouted to. But… that’s generally only when the cones are where they’re supposed to be. I’ve seen enough roadwork in Phoenix to know that sometimes Mad Max rules get applied, and even I wouldn’t know how to drive through some of those work zones. It was pretty rare that I’d have to remotely take over an SDC, but 9/10 times that I did, it was because of construction signs/equipment being in weird places, and I’d have to K-turn the car back the way it came.

      That’s not to say that construction consistently causes the cars to get stuck, but I’d say it was one of the more common pain points. In theory, if somebody were to run over a cone and nobody picks it back up, an SDC might not interpret that obstruction properly and can make a dumb decision like going down the wrong lane, under the incorrect assumption that traffic has been temporarily rerouted that way. It sounds scary, and probably looks scary as hell if you saw it on the street, but even then it’s going to stop itself before coming anywhere near an oncoming car, even if it thinks it has right of way, since proximity to other objects takes priority over temporary signage.
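
      The priority ordering I mean is roughly this toy sketch; the names and threshold are hypothetical, not anything from Waymo’s stack.

      ```python
      # Hypothetical priority sketch: object proximity outranks temporary signage.
      SAFE_GAP_M = 5.0  # made-up safety threshold

      def choose_action(nearest_object_m, cones_suggest_reroute):
          if nearest_object_m < SAFE_GAP_M:
              return "stop"            # proximity to other objects always wins
          if cones_suggest_reroute:
              return "follow_reroute"  # trust cones only when the path is clear
          return "follow_lane"
      ```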

      The “driving through a red light” part I’m assuming might actually be inaccurate. Cops do lie, after all. I 100% believe in a Waymo car going down the opposing lane after some sketchy road cones, but I have a hard time buying that it ran a red light, since they will not go if they don’t detect a green light. Passing through an intersection requires a positive detection of a green light; positive or negative detection of red doesn’t matter. It has to see a green light for its intended lane or it will assume it has to stop at the line.
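
      In pseudocode terms, the rule is roughly the sketch below; the threshold and names are invented for illustration, not real code.

      ```python
      # Illustrative only: proceeding requires a confident *green* detection.
      GREEN_CONFIDENCE = 0.9  # made-up threshold

      def may_proceed(light_detections):
          """light_detections: e.g. {"green": 0.97} from perception (hypothetical)."""
          # Failing to see red is not enough; without a confident green, stop at the line.
          return light_detections.get("green", 0.0) >= GREEN_CONFIDENCE
      ```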

      In the video, the cop says he turns on his lights and the SDC blows through a red light. While I was working there, red light violations were so rare that literally 100% of the red light violations we received happened while a human was driving the car in manual mode. What I’d assume was going on is that the SDC was already “owning” the intersection for an unprotected left turn when the lights came on. When an SDC thinks it’s being pulled over, it goes through its “pullover” process, which first requires exiting an intersection if it’s currently in one. So what likely happened: the SDC was already in the intersection preparing for a left turn; the light turned red while the SDC was in the box (where it still legally had right of way to the intersection); the cop turned on the sirens; and the SDC proceeded “forward” through the intersection until it was able to pull over.
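
      As a state-machine sketch of that sequence (my reconstruction of the behavior, not actual code):

      ```python
      # Hypothetical pullover state machine: clear the intersection before stopping.
      def pullover_step(state, in_intersection):
          if state == "driving":
              # Lights/sirens detected: clear the intersection first if we're in one.
              return "clearing_intersection" if in_intersection else "pulling_over"
          if state == "clearing_intersection":
              return "pulling_over" if not in_intersection else state
          if state == "pulling_over":
              return "stopped"
          return state
      ```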

      But, that’s just my speculation based on my somewhat outdated understanding of the software behind these cars. I’d love to see the video of it, but I doubt Waymo will release it unless there’s a lawsuit.

      • Prison Mike@links.hackliberty.org · 5 months ago

        The red light bit seems spot on. In every article stating “it blew through a red light” there’s always the caveat that it’s just trying to clear the intersection while getting pulled over. Technically people are allowed to do that (and/or move to a safer area, such as getting into the right lane when being pulled over in the left lane).

        I think media like to add the intersection stuff to rile people up.

  • benji@lemmy.world · 4 months ago

    We’re still so far away from this technology being viable for everyday use, aren’t we?