The incident in northern California marked the latest mishap blamed on the electric vehicle company’s Autopilot tech

  • TheEighthDoctor@lemmy.world · 1 month ago

    That’s what I was thinking: your car starts doing something fucking stupid and you just let it?

      • Echo Dot@feddit.uk · 1 month ago

        Delicately put. But that’s essentially why self-driving cars aren’t really seen outside of Tesla. Unless the technology is basically perfect, there’s no point to it.

        Tesla have it because they use the public as guinea pigs.

        I wouldn’t mind if they all had to go to some dedicated test track to try it out and train it, and it wouldn’t turn on outside of those environments. If they want to risk their lives, that’s their prerogative; my problem is that it might drive into me one day, and I don’t own a Tesla, so why should I take that risk?

    • T156@lemmy.world · 1 month ago

      It’s rather reminiscent of the old days of GPS, when people would follow it to the letter, and drive into rivers, go the wrong way up a one-way street, etc.

      • Echo Dot@feddit.uk · 1 month ago

        There was a legal case recently where somebody drove off a bridge that was no longer there. At some point you have to take personal responsibility, since the outcomes will be extremely personal.