agnos.is Forums


Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis

Technology · Tags: tesla, fsd, autonomy, selfdriving
201 Posts · 107 Posters
  [email protected] wrote:

    In Boca Raton, I've seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.

    Insanely, you can slam on the gas in Tesla's self-driving mode, accelerate to 100MPH in a 45MPH zone, and strike another vehicle, all without the vehicle's "traffic aware" automation effectively applying a brake.

    That's not sensationalist. That really is just insanely designed.

    [email protected] · #110

    FTFA:

    Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.

    That’s how you would strike a motorcyclist at such extreme speed, simply press the accelerator and all other inputs are apparently overridden.

    If the guy smashes the gas, just like in cruise control I would not expect the vehicle to stop itself.

    The guy admitted to being intoxicated and held the gas down... what's the self-driving contribution to that?

  [email protected] wrote:

      Mercedes uses LiDAR. They also operate the sole Level 3 driver automation system in the USA. Two models only, the new S-Class and EQS sedans.

      Tesla alleges they'll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We'll see.

      [email protected] · #111

      Yeah, keep in mind that Elon couldn't get level 3 working in a closed, pre-mapped circuit. The robotaxis were just remotely operated.

      • C [email protected]

        FTFA:

        Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.

        That’s how you would strike a motorcyclist at such extreme speed, simply press the accelerator and all other inputs are apparently overridden.

        If the guy smashes the gas, just like in cruise control I would not expect the vehicle to stop itself.

        The guy admitted to being intoxicated and held the gas down... what's the self-driving contribution to that?

        [email protected] · #112

        I know what's in the article, boss. I wrote it. No need to tell me FTFA.

        TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.

        I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.

        Here's the manual, if you're curious. It doesn't work in bright sunlight, fog, excessively curvy roads (???), situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit, and anything else - if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.

        https://www.tesla.com/ownersmanual/2012_2020_models/en_us/GUID-50331432-B914-400D-B93D-556EAD66FD0B.html#:~:text=Traffic-Aware Cruise Control determines,maintains a set driving speed.
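The arbitration being argued over here can be sketched in a few lines (illustrative Python only; the function names and numbers are mine, not anything from Tesla):

```python
# Illustrative sketch of the disputed behavior -- NOT Tesla's actual code.
# In classic cruise control, pressing the accelerator pushes you past the
# set speed. The expectation described above is that a "traffic aware"
# mode should cap speed at whatever the detected lead vehicle allows,
# pedal input or not.

def classic_cruise_speed(set_speed: float, pedal_request: float) -> float:
    """Classic cruise control: pedal input can exceed the set speed."""
    return max(set_speed, pedal_request)

def traffic_aware_speed(set_speed: float, pedal_request: float,
                        lead_safe_speed: float) -> float:
    """Expected behavior: never exceed what the lead vehicle allows."""
    return min(max(set_speed, pedal_request), lead_safe_speed)

# Motorcycle ahead limits safe speed to 45 mph; driver floors it to 100.
print(classic_cruise_speed(45, 100))     # 100 -- pedal wins
print(traffic_aware_speed(45, 100, 45))  # 45 -- traffic detection wins
```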

        • B [email protected]

          like regulators not allowing dangerous products,

          I include human drivers in the list of dangerous products I don't want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don't want regulators to pick favorites. I want them to find "the truth".

          [email protected] · #113

          Sure, we're in agreement as far as that goes. My point was just that the commenter above me was implying it should be common knowledge that Tesla self-driving hits motorcycles more than other self-driving cars do. And whether their comment was about this or some other subject, I think it's counterproductive to say "everyone knows that."

          [email protected] wrote:

            I know what's in the article, boss. I wrote it. No need to tell me FTFA.

            TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.

            I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.

            Here's the manual, if you're curious. It doesn't work in bright sunlight, fog, excessively curvy roads (???), situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit, and anything else - if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.

            https://www.tesla.com/ownersmanual/2012_2020_models/en_us/GUID-50331432-B914-400D-B93D-556EAD66FD0B.html#:~:text=Traffic-Aware Cruise Control determines,maintains a set driving speed.

            [email protected] · #114

            So do you expect self-driving tech to override human action? Or do you expect human action to override self-driving tech?

            I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited and experimental implementations that are not road-legal nationwide. I kind of expect human input to override the robot, given the fear of robots making mistakes, despite the humans behind them getting in drunk and holding down the throttle until they turn motorcyclists into red mist. But that's my assumption.

            With the Boca Raton one specifically, the guy got in his car inebriated. That was the first mistake, the one that caused a problem that should never have happened. If the car were truly automated, with no user input, this wouldn't have happened. It wouldn't have gone nearly 2.5x the speed limit. It would have braked long before hitting someone in the road.

            I have a Ninja 650. We all know the danger comes from things we cannot control, such as other people. I'd always trust an actually automated car over a human driver, even with limited modern tech. The second the user gets an input, though? Zero trust.

            • L [email protected]

              Good to know, I'll stay away from those damn things when I ride.

              [email protected] · #115

              I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.

              Just be sure to carefully watch your six when you're sitting at a stoplight. I've gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I'll have to scoot out of the way of some imbecile who's coming in hot. That's hard to do when your front tire is 24" away from the license plate of the car in front of you.

              [email protected] wrote:

                I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.

                Just be sure to carefully watch your six when you're sitting at a stoplight. I've gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I'll have to scoot out of the way of some imbecile who's coming in hot. That's hard to do when your front tire is 24" away from the license plate of the car in front of you.

                [email protected] · #116

                For me it depends which bike I'm riding. If it's my 49cc scooter, I'll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I'll just filter to the front (legal in Utah).

                • G [email protected]

                  Maybe, if that two-step determination of liability is really what the parent commenter had in mind.

                  I'm not so sure he'd agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and software running on the car in general) be forced to be Free Software and put it squarely and completely within the control of the vehicle owner.

                  [email protected] · #117

                  I would assume everyone here would agree with that 😘

                  • S [email protected]

                    It's hardly either/or, though. What we have here is empirical data showing that cars without lidar perform worse, so mandating lidar is grounded in empirical results. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality-statistics targets.

                    [email protected] · #118

                    We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.

                    • A [email protected]

                      I imagine bicyclists must be affected as well if they're on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.

                      Time to go to the Netherlands.

                      [email protected] · #119

                      Human-driven cars still target bicyclists on purpose, so I don't see how Teslas could be any worse…

                      P.S. Painting a couple of lines on the side of the road does not make a safe bike lane… they need a physical barrier separating them from the road… like how curbs separate the road from sidewalks…

                      [email protected] wrote:

                        TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

                        Brevity is the spirit of wit, and I am just not that witty. This is a long article, here is the gist of it:

                        • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
                        • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
                        • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

                        Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

                        Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

                        [email protected] · #120

                        Remember, you have the right to self-defence, against both rogue robots and rogue humans.

                        Guest wrote:

                          Self driving vehicles should be against the law.

                          [email protected] · #121

                          Teslas aren't even worthy of the designation "self-driving". They use cheap cameras instead of LIDAR. It should be illegal to call such junk "self-driving".

                          • A [email protected]

                            That seems like a spectacular oversight. How is it supposed to replicate human vision without depth perception?

                            [email protected] · #122

                            The video 0x0 linked to in another comment describes the likely method used to infer distance to objects without a stereoscopic setup, and why it (likely) had issues determining distance in the cases where they hit motorcycles.
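For what it's worth, one common monocular trick is the pinhole relation distance = focal_length_px × real_size / pixel_size, which is only as good as the assumed real-world size of the object. A toy sketch of that fragility (my numbers, not anything from Tesla's pipeline):

```python
# Toy pinhole-camera distance estimate from a single image -- an
# illustration of why monocular depth is fragile, not Tesla's method.
# distance = focal_length_px * assumed_real_height_m / pixel_height

def mono_distance_m(focal_px: float, assumed_height_m: float,
                    pixel_height: float) -> float:
    return focal_px * assumed_height_m / pixel_height

FOCAL_PX = 1000.0   # hypothetical focal length, in pixels
PIXELS = 50.0       # object spans 50 pixels vertically in the frame

# Assume car-like taillight height (1.4 m) for what is actually a 1.0 m
# motorcycle tail: the same 50-pixel blob is judged 40% farther away.
print(mono_distance_m(FOCAL_PX, 1.4, PIXELS))  # 28.0 m (car assumption)
print(mono_distance_m(FOCAL_PX, 1.0, PIXELS))  # 20.0 m (actual distance)
```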

                            [email protected] wrote:

                              TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

                              Brevity is the spirit of wit, and I am just not that witty. This is a long article, here is the gist of it:

                              • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
                              • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
                              • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

                              Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

                              Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

                              Guest · #123

                              Makes sense: a statistically smaller sample to be trained on. Relatively easy fix, just retrain with more motorcycles in the data.
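"Retrain with more motorcycles" usually comes down to countering class imbalance, e.g. oversampling the rare class or weighting its loss by inverse frequency. A minimal sketch with invented counts:

```python
# Minimal sketch of inverse-frequency class weighting for an imbalanced
# detection dataset. Counts are made up for illustration.
from collections import Counter

labels = ["car"] * 9000 + ["truck"] * 800 + ["motorcycle"] * 200
counts = Counter(labels)
total = sum(counts.values())

# Weight each class inversely to its frequency, normalized so the
# average per-class weight stays near 1.
weights = {cls: total / (len(counts) * n) for cls, n in counts.items()}

# The rare class is weighted ~45x more heavily than the common one, so
# motorcycle mistakes dominate the training signal instead of vanishing
# into the car-heavy data.
ratio = weights["motorcycle"] / weights["car"]
print(round(ratio))  # 45
```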

                              • I [email protected]

                                Remember, you have the right to self-defence, against both rogue robots and rogue humans.

                                Guest · #124

                                How do you plan to self-defend against a vehicle?

                                • N [email protected]

                                  Affectively, does it realy mater if someone has slite misstakes in there righting?

                                  [email protected] · #125

                                  I think I had a stroke reading that. Take your upvote and get out!

                                  Guest wrote:

                                    How do you plan to self-defend against a vehicle?

                                    [email protected] · #126

                                    Propane cylinder. Mutually assured destruction.

                                    [email protected] wrote:

                                      We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.

                                      [email protected] · #127

                                      Those are ways to gather empirical results, though they rely on artificial, staged situations.

                                      I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. It would not be markedly better

                                      [email protected] wrote:

                                        I would assume everyone here would agree with that 😘

                                        [email protected] · #128

                                        I mean, maybe, but previously when I've said that it's typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it's somehow suddenly too dangerous to allow owners to control their property just because software is involved.

                                        Guest wrote:

                                          Self driving vehicles should be against the law.

                                          [email protected] · #129

                                          Shouldn't be an issue if drivers used it as a more advanced cruise control. Unless there is catastrophic mechanical or override failure, these things will always be the driver's fault.
