New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times
Six officers who were injured in the crash are suing Tesla, despite the fact that the driver was allegedly impaired.

  • hoodlem@hoodlem.me · ↑87 · 2 years ago

    In fact, by the time the crash happens, it’s alerted the driver to pay more attention no less than 150 times over the course of about 45 minutes. Nevertheless, the system didn’t recognize a lack of engagement to the point that it shut down Autopilot

    I blame the driver, but if the above is true there was a problem with the Tesla as well. The Tesla is intended to disengage and disable autopilot for the remainder of the drive after a small number of ignored alerts. If the car didn’t do that, there’s a bug in the Tesla software.

    I think it’s more likely the driver used a trick to make the car think he was engaged when he was not. You can do things like wedge a water bottle into the steering wheel to simulate the tug that proves you are engaged. (Don’t ask me how I know.)

    • NeoNachtwaechter@lemmy.world · ↑63 · 2 years ago

      The Tesla is intended to disengage and disable autopilot

      What about: slow down, pull up to the right, stop the car, THEN disengage?

    • RushingSquirrel@lemm.ee · ↑20 · 2 years ago

      After 3 alerts, it’s off until you park. There are visual cues that precede the alert, though, and these don’t count. I don’t recall how many there are or how long they last, but you start by seeing a message asking you to put your hands on the wheel, then a blue line at the top, then the line starts pulsing, then you get an audio alert, which is the first strike. Three strikes during the same drive and you need to park before using Autopilot again.
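      The escalation ladder described above can be sketched as a small state machine. To be clear, this is a hypothetical illustration of the comment's description, not Tesla's actual implementation; the class, method names, and the rule that only the audio alert counts as a strike are taken from the comment, everything else is invented.

      ```python
      from dataclasses import dataclass

      # Warning stages in the order the comment describes them.
      STAGES = ["message", "blue_line", "pulsing_line", "audio_alert"]

      @dataclass
      class AttentionMonitor:
          strikes: int = 0
          stage: int = -1           # -1 = no warning ladder active
          max_strikes: int = 3

          def tick_without_input(self) -> str:
              """No steering input this interval: escalate one stage."""
              if self.stage < len(STAGES) - 1:
                  self.stage += 1
              if STAGES[self.stage] == "audio_alert":
                  self.strikes += 1  # only the audio alert counts as a strike
                  self.stage = -1    # the visual ladder restarts afterwards
                  if self.strikes >= self.max_strikes:
                      return "autopilot_locked_until_park"
                  return "strike_registered"
              return STAGES[self.stage]

          def driver_input(self) -> None:
              """Steering torque detected: reset the ladder (strikes persist)."""
              self.stage = -1
      ```

      Note that `driver_input` resets only the warning ladder, not the strike count, which matches the claim that three strikes during the same drive lock Autopilot out until you park.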

  • zerbey@lemmy.world · ↑47 ↓4 · 2 years ago

    150 more warnings than a regular car would give, ultimately it’s the driver’s fault.

  • N3Cr0@lemmy.world · ↑28 · 2 years ago

    Poor drunk, impaired driver falling victim to autonomous driving… Hopefully that driver lost their license.

  • Jeena@jemmy.jeena.net · ↑28 ↓1 · 2 years ago

    So if the guy behind the wheel died and couldn’t react to the alerts, the car can’t decide to just stop instead of crashing into a police car?

  • thatKamGuy@sh.itjust.works · ↑25 · 2 years ago

    Driver is definitely the one ultimately at fault here, but how is it that Tesla doesn’t perform an emergency stop in this situation - but just barrels into an obstacle?

    Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than Tesla?!

    • RushingSquirrel@lemm.ee · ↑7 ↓2 · 2 years ago

      I believe this is caused by the fog combined with flashing lights and upward/curved road. The Tesla autopilot system is super impressive in almost all situations but you can clearly see the limits in extreme situations. Here, the drunk driver is definitely at fault, I don’t understand why they’d sue Tesla.

    • NeoNachtwaechter@lemmy.world · ↑2 · 2 years ago

      Even my relatively ‘dumb’ car […] handles […] better than Tesla?!

      Not going to be the last time when you experience that :-)

  • Md1501@lemmy.world · ↑20 ↓1 · 2 years ago

    You know what might work: program the car so that after the second unanswered alert, the autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, Autopilot is disabled for that car for a period of time.
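    The proposal above maps naturally to a simple lookup from the count of ignored alerts to an escalating action list. This is just a sketch of the idea in the comment; the action names and the 24-hour lockout are made-up placeholders.

    ```python
    # Hypothetical graduated response to ignored attention alerts:
    # warn first, pull over on the second, lock out on the third.
    def respond_to_ignored_alert(unanswered_alerts: int) -> list[str]:
        """Map the count of ignored alerts to escalating actions."""
        if unanswered_alerts <= 1:
            return ["audible_warning"]
        if unanswered_alerts == 2:
            return ["reduce_speed", "hazard_lights_on", "pull_over_when_safe"]
        # Third (or later) ignored alert: stop assisting entirely for a while.
        return ["controlled_stop", "disable_autopilot_for_24h"]
    ```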

    • Technoguyfication@lemmy.ml · ↑1 · 2 years ago

      This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

  • r00ty@kbin.life · ↑9 ↓1 · 2 years ago

    I’m not so sure that disengaging Autopilot because the driver’s hands were not on the wheel while on a highway is the best option. Engage the hazard lights, remain in lane (or, if able, move to the slowest lane) and come to a stop. Surely that’s the better way?

    Just disengaging the autopilot seems like such a cop-out to me. Also, the fact that it disengaged right at the end (“the driver was in control at the moment of the crash”) again feels like bad “self” driving, especially when the so-called self-driving is able to come to a stop as part of its software in other situations.

    Also if you cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the usually bright emergency lights saturating the image it was trying to analyse) it’s again a sign you shouldn’t be releasing this to the public. It’s clearly just not ready.

    Not taking any responsibility away from the human driver here. I just don’t think the behaviour was good enough for software controlling a car used by the public.

    Not to mention, of course, the reason for suing Tesla isn’t because they think they’re more liable. It’s because they can actually get some money from them.

      • NeoNachtwaechter@lemmy.world · ↑6 · 2 years ago

        That’s not the main problem. It is more like an excuse. The main problem has been explained in the video right before that:

        Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

        The emergency vehicles just happen to be your most frequent kind of obstacles.

        The fallback to the camera is a bad excuse anyway, because the radar is what detects obstacles first; the camera will usually pick them up later (i.e. at closer range) than the radar.

        The even better solution (trigger warning: nerdy stuff incoming) is to fuse the results of all the sensor types at an early stage in the processing software. That’s what European car makers have done from the beginning, but Tesla is way behind with their engineering. Their sensors still work independently, and each does its own processing. So every shortcoming of one sensor creates a faulty detection result that has to be corrected later (read: seconds later, not milliseconds) by other kinds of sensors.
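        The early-versus-late fusion distinction made above can be shown with a toy example: early fusion pools per-sensor evidence for an obstacle before any per-sensor verdict, while late fusion lets each sensor decide alone and only then reconciles the votes. The sensor names, confidence numbers, and the 0.5 threshold are all invented for illustration.

        ```python
        # Early fusion: combine raw confidences, then decide once.
        def early_fusion(confidences: dict[str, float], threshold: float = 0.5) -> bool:
            pooled = sum(confidences.values()) / len(confidences)
            return pooled >= threshold

        # Late fusion: each sensor decides independently; majority vote wins.
        def late_fusion(confidences: dict[str, float], threshold: float = 0.5) -> bool:
            votes = [c >= threshold for c in confidences.values()]
            return sum(votes) > len(votes) / 2

        # A hazy stationary car: camera unsure (0.4), radar unsure (0.45),
        # ultrasonic fairly confident (0.7). Early fusion flags the obstacle;
        # late fusion misses it because no majority of sensors is individually sure.
        readings = {"camera": 0.4, "radar": 0.45, "ultrasonic": 0.7}
        ```

        The point is that weak, agreeing evidence from several sensors can cross the detection threshold together even when no single sensor would have fired on its own.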

        • r00ty@kbin.life · ↑1 · 2 years ago

          Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

          I feel like this is bad tech understanding in journalism (which is hardly new). There’s no reason radar couldn’t see stationary vehicles. In fact, very specifically, they’re NOT stationary relative to the radar transceiver. Radar would see them no problem.

          My actual suspicion here is that Tesla actively ignores stationary vehicles (it can know they’re stationary by adding its known speed to the relative speed) not in front of the vehicle. Now, in normal streets this makes sense (or at least those on the non-driver’s side). Do you pay attention to every car parked by the side of the road when driving? You’re maybe looking for signs of movement, or lights on, etc. But you’re not tracking them all, and neither will the autopilot. However, on a highway if you have more than 1 vehicle on the shoulder every now and then it should be making you wonder what else is ahead (and I’d argue a single car on the shoulder is a risk to keep watch on). A long line of them should definitely make you slow down.

          I think human drivers would do this, and I think an autopilot should consider what kind of road it is on and whether it should treat these scenarios differently.

          I also have another suspicion, but it’s just a thought. If this Tesla was really using radar as well as cameras, haze or not, it should have seen that stationary vehicle further ahead than it did. Newer Tesla cars don’t have radar, and coming from a software development background, I can actually see a logical (in terms of corporate thinking) reason to remove the code for radar: they simply will not want to maintain it if they have no plans to return to radar. Think of it like this. After a few versions of augmenting the camera detection logic, it is unlikely to still work with the existing radar logic. Do they spend the time to make them work together for the older vehicles, or only allow camera-based AI on newer software versions? I would suspect the latter would be the business decision.
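          The inference described a couple of paragraphs up — a radar only measures speed relative to the car, so adding the car's own speed recovers a target's absolute speed, and a near-zero absolute speed means a parked or stopped object — is simple enough to write down. The function name and the 0.5 m/s tolerance are illustrative, not anything from Tesla's software.

          ```python
          # Radar reports speed relative to us (negative = closing).
          # target_speed = ego_speed + relative_speed, so a stopped car ahead
          # of a car doing 30 m/s shows up as -30 m/s relative.
          def is_stationary(ego_speed_mps: float, relative_speed_mps: float,
                            tolerance_mps: float = 0.5) -> bool:
              """True if the radar target's absolute speed is near zero."""
              target_speed = ego_speed_mps + relative_speed_mps
              return abs(target_speed) <= tolerance_mps
          ```

          This is exactly why "radar can't see stationary cars" is misleading: the return is perfectly visible, and whether to discard it is a filtering decision made in software, not a physical limitation.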

      • r00ty@kbin.life · ↑1 · 2 years ago

        The question here is: could you see that there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze, the autopilot must be able to as well.

        Moreover, it now needs to be extra good at spotting vehicles in bad lighting conditions, because other sensors are removed on newer Teslas. It only has cameras to go on.

  • Jordan Lund@lemmy.one · ↑9 ↓6 · 2 years ago

    Don’t see how that’s a Tesla problem… Drunk/high driver operating their car incorrectly.

          • Jordan Lund@lemmy.one · ↑2 ↓1 · 2 years ago

            Autopilot doesn’t work that way, the drunk should have known that when he wasn’t drunk and not tried to use it that way.

            It’s like the old shaggy dog story about the guy driving a camper, setting the cruise control, then going into the back to make lunch.

            That’s not the fault of the cruise control.

  • Peanut@sopuli.xyz · ↑3 ↓1 · 2 years ago

    I still think Tesla did a poor job of conveying the limitations at scale. They piggybacked on Waymo’s capability and practice without matching it, which is probably why so many people are over-reliant. I’ve always been against mass-producing semi-autonomous vehicles for the general public. This is why.

    And then this garbage is used to attack the general concept of autonomous vehicles, which could become a fantastic life-saver, because then it could safely drive these assholes around.

    • Thorny_Thicket@sopuli.xyz · ↑10 ↓13 · edited · 2 years ago

      A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is less than half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn’t change these facts.

      In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

      Source
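      The ratios implied by the quoted figures can be checked directly: 4.41M over 1.2M miles per crash is the "almost 4 times" claim, and 1.2M over 484k is the Tesla-versus-average comparison (which actually works out to about 2.5×, not 2×). The numbers below are taken straight from the quote; nothing here validates how Tesla collected them.

      ```python
      # Miles driven per recorded crash, as quoted from Tesla's Q2 report.
      autopilot    = 4_410_000  # Tesla, Autopilot engaged
      no_autopilot = 1_200_000  # Tesla, Autopilot off
      us_average   =   484_000  # NHTSA fleet-wide figure

      autopilot_vs_manual = autopilot / no_autopilot   # ~3.7x fewer crashes
      tesla_vs_average    = no_autopilot / us_average  # ~2.5x fewer crashes
      ```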

      • tiny_electron@sh.itjust.works · ↑19 ↓1 · 2 years ago

        There is a bias in these numbers. Teslas are expensive and not everyone is buying them. The lower accident rate can be explained by the different demographic driving the vehicle rather than by Teslas being better. For example, younger people might be more likely to cause accidents because of various factors, and they are also less likely to buy a Tesla because it is so expensive. I don’t have the numbers for this, but we should all be careful with Tesla’s safety claims when they compare themselves to the global average.

        • Thorny_Thicket@sopuli.xyz · ↑3 ↓9 · 2 years ago

          Sure. There are always multiple factors in play. However I’d still be willing to bet that there’s nothing in Teslas that makes them inherently unsafe compared to other cars.

        • Thorny_Thicket@sopuli.xyz · ↑5 ↓10 · 2 years ago

          Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on

            • narp@feddit.de · ↑4 ↓2 · 2 years ago

              You made the first comment: “Teslas aren’t safe”, without providing proof.

              And now you’re calling someone a hypocrite because he asks for data of exactly what you claimed, while you’re redefining your first argument as “the contrary”.

              So, do you have proof that Teslas aren’t safe in comparison to other cars, or is it just your opinion?

              • masterairmagic@sh.itjust.works · ↑0 ↓2 · 2 years ago

                We’re literally having this discussion under a video where automatic braking should have kicked in, but didn’t.

                • narp@feddit.de · ↑2 ↓1 · 2 years ago

                  But you can’t base a fact on one accident. Or even multiple. What if newspapers like to write especially about Tesla accidents to generate clicks?

                  Teslas seemingly have a lot of accidents, but without checking the statistics and comparing them to those of other manufacturers, you wouldn’t really know whether the perceived truth is a fact or not.

            • Thorny_Thicket@sopuli.xyz · ↑3 ↓4 · edited · 2 years ago

              Tesla model Y scored the highest possible score on IIHS crash test as well as 5 stars on Euro NCAP

              Their other models have similar results. I believe Model X is the safest SUV ever made.

              EDIT:

              “More than just resulting in a 5-star rating, the data from NHTSA’s testing shows that Model X has the lowest probability of injury of any SUV it has ever tested,” Tesla said in a statement. “In fact, of all the cars NHTSA has ever tested, Model X’s overall probability of injury was second only to Model S.”

              Source

                • Thorny_Thicket@sopuli.xyz · ↑3 ↓3 · 2 years ago

                  Or maybe you’re so blinded by the hatred towards Musk that you can’t even think straight and no evidence in the world could convince you otherwise?

                  You really should’ve checked the last link.

      • NeoNachtwaechter@lemmy.world · ↑1 ↓1 · edited · 2 years ago

        almost 4 times less likely to be involved in a crash than a human driven

        Not relevant at all here, when we are discussing occurrences that seem so easily and obviously avoidable.

        (But it’s nice to see that the Fanboi team is awake now)

  • pizzaiolo@slrpnk.net · ↑2 ↓1 · 2 years ago

    It’s what you get when you design places that require cars for everything