The autopilot will turn off just before hitting them to make you liable anyway
That’s only if you didn’t subscribe to the Ludicrous package.
Nah even then. Ain’t no way Tesla admits fault for anything
Until they go the way of PayPal, at least. Musk’s exit plan is Mars, remember?
Can we please speed up his exit plan?
PayPal sold for a billion bucks, the largest sale ever, at the time. Now it’s just integrated into eBay, which also isn’t going anywhere, so I have no idea what you’re implying. Did I miss something?
eBay and PayPal broke off 9 years ago btw.
Autopilot turns off because the car doesn’t know what to do and the driver is supposed to take control of the situation. The autopilot isn’t autopilot, it’s driving assistance, and you want it to turn off if it doesn’t know what it should do.
Autopilot also turns off on planes when things go wrong.
Sure, what I meant though was that Tesla doesn’t have self-driving cars the way they try to market it. They’re no different from what other car manufacturers have, they just use a more deceptive name.
Autopilot turns off before collision because physical damage can cause unpredictable effects that could cause another accident.
Let’s say you run into a wall: autopilot is broken, and the car thinks it needs to go backwards. You’ve now killed three more people.
I hate Elon Musk and Teslas are bad, but let’s not spread misinformation.
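To make that concrete, here’s a minimal sketch of the kind of fail-safe check being described. It’s purely illustrative, with invented names and thresholds, and nothing like Tesla’s actual code:

```python
# Hypothetical sketch only: a driver-assistance system that stops acting,
# and hands control back, when it can no longer trust itself.

MIN_CONFIDENCE = 0.8  # invented threshold, for illustration only


def should_disengage(perception_confidence: float, actuator_fault: bool) -> bool:
    """Disengage rather than act on unreliable inputs or damaged hardware."""
    if actuator_fault:
        # A car that just hit something may have broken sensors or actuators;
        # letting it keep "recovering" (e.g. reversing) could cause a second crash.
        return True
    if perception_confidence < MIN_CONFIDENCE:
        # The assistant doesn't know what to do, so the driver must take over.
        return True
    return False


def control_step(perception_confidence: float, actuator_fault: bool) -> str:
    if should_disengage(perception_confidence, actuator_fault):
        return "alert driver and release automated control"
    return "continue lane keeping / adaptive cruise"


# Example: confident and undamaged -> keep assisting; after an impact -> hand back.
print(control_step(0.95, actuator_fault=False))
print(control_step(0.95, actuator_fault=True))
```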
It actually does. Teslas are great.
This reminds me of that Chinese law about being personally responsible for all medical debts of a person you run over—incentivizing killing the person, rather than injuring them.
That’s been revised…right?
Even with autopilot I feel it’s unlikely that the driver would not be liable. We haven’t had a case yet, but once this happens and goes higher up the courts it’ll immediately establish a liability precedent.
Some interesting headlines:
- a fatal crash where the driver claims his Tesla was on autopilot when it struck and killed a motorcyclist (ongoing)
- ‘Autopilot’ hit-and-run driver sentenced to nine months (Australia; he claims autopilot was on, but it’s unclear)
So I’m pretty sure that autopilot drivers would be found liable very fast if this developed further.
I am not a lawyer.
I think an argument can be made that a moving vehicle is no different than a lethal weapon, and the autopilot, nothing more than a safety mechanism on said weapon. Which is to say the person in the driver’s seat is responsible for the safe operation of that device at all times, in all but the most compromised of circumstances (e.g. unconscious, heart attack, taken hostage, etc.).
Ruling otherwise would open up a transportation hellscape where violent acts are simply passed off to insurance and manufacturer as a bill. No doubt those parties would rush to close that window, but it would be open for a time.
Cynically, a corrupt government in bed with big monied interests would never allow the common man to have this much power to commit violence. Especially at their expense, fiscal or otherwise.
So just or unjust, I think we can expect the gavel to swing in favor of pushing all liability to the driver.
They’re most likely liable. “FSD” is not full self-driving, it’s still a test product, and I guarantee the conditions for using it include paying attention and keeping your hands on the wheel. The legal team at Tesla definitely made sure they weren’t on the hook.
Now where there might be a case for liability is Elon and his stupid Twitter posts and false claims about FSD. Many people have been misled, and it’s probably contributed to a few of the autopilot crashes.
You’re still in control of the vehicle, therefore you’re still liable. Like plopping a 5 year old on your lap to drive while you nap, if they hit people it’s still your fault for handing over the control to something incapable of driving safely while you were responsible for the vehicle.
But a reasonable person would not consider a child capable of driving. An “extremely advanced algorithm that is better and safer than humans and everyone should use it” is very different in this case. After hearing all the stupid fluff, it is not unreasonable to think that self-driving is good.
I’m not aware of a single jurisdiction on the planet that makes Tesla liable for what the vehicle does when autopilot is enabled. In order to activate autopilot you have to accept about 3 different disclaimers on the car’s screen that state VERY clearly how you are still responsible for the vehicle and you must intervene if it starts behaving dangerously.
I’ve been driving with autopilot for over 2 years, and while it has done some stupid stuff before (taking wrong turns, getting in the wrong lane, etc.), it has NEVER come close to hitting another vehicle or person. Any time something out of the ordinary happens, I disengage autopilot and take over.
Condolences on owning a tesla
Bro bought a Tesla just 2 years ago. Long after it was very widely known just how much of an arsehole Musk was, and after many other excellent EVs were on the market.
I’ll let you draw the conclusions from those facts.
When I bought my car, there were no widespread plans for other manufacturers to adopt NACS, you couldn’t get your hands on a Rivian for less than $100k, and I was commonly driving long distances for work, so I needed a vehicle with long range that I could charge quickly on trips. Tesla checked all the boxes.
I haven’t experienced any of these super widespread quality or reliability issues people on the internet talk about. It was delivered with no issues, has needed very little maintenance (just tire rotations basically), and it’s not falling apart like some would lead you to believe. I don’t know what to say other than that my personal experience with the vehicle has been great, and that’s what I really care about in a vehicle. I don’t buy cars based off what the CEO says on Twitter.
Hate Musk or not, the Tesla is still a very good car. In many markets it’s often still the better value.
Yeah and while Elon is the fucking worst I assume not everyone knows that he is the Tesla man. It’s incredible actually how much he’s intertwined with the brand. I would totally buy a Toyota or whatever and I couldn’t tell you the name of their CEO, nor of any other car manufacturer, nor would I look up who they are beforehand.
Granted the poster above is on Lemmy so I assume he knows more about musky boy than he would like.
I have a Ford too and couldn’t even tell you who the CEO of Ford is. Teslas are great daily drivers, I don’t care what the CEO does or says online.
his username is technoguyfication, either it’s a troll account or he is rolling with the technobro moniker
I’ve had this username since I was 11 years old, you don’t need to read that deeply into it haha
Everything I’ve heard says that Teslas have had huge reliability problems.
Haven’t experienced any myself. I’m just a single data point, but my car has been nothing but reliable from day one. It’s a great daily driver.
These days not really. I’m gonna get downvoted to oblivion obviously because this is Lemmy, but generally the cars are more than fine these days
Unless you forget to put them in car wash mode, or it happens to combust while you’re driving
🙄
For context, do you own a Tesla, and if so, what other cars have you owned?
You can think whatever you want, but my experience driving it has been perfectly fine. Range is great, the car is not falling apart like some people claim, it was not delivered with any issues, and chargers are plentiful where I live. Those are the main things I (and many others) care about in a vehicle. I don’t care what the CEO does or says online. I have a Ford as well and couldn’t even tell you who the CEO of Ford is.
If someone is injured or killed by a Tesla car, they can sue the company directly, regardless of any legal agreements you may have as the owner. Whether they win is a different question, but they might win if they could show that Tesla was negligent, and especially if Tesla was willfully negligent.
Just because you think you’re responsible, even if you agreed in triplicate that you’re responsible, doesn’t necessarily make you legally responsible, depending on the circumstances. And that’s the way it should be.
The funny part will be when the car doesn’t have a driver and is fully autonomous. If the car kills someone, who’s to blame?
The person for getting in the way, obviously
The company that rented it to you, because fully self-driving cars won’t be for private ownership, they’ll just replace rideshare drivers.
Whichever was at fault is my non-lawyer opinion.
What kind of penalty you apply to a self-driving car guilty of causing an accident is a good question, though.
You will be liable either way. If you don’t do anything, you broke the terms by not being attentive enough.
Imagine having a car that doesn’t pretend to drive itself but is enjoyable to drive, a car that doesn’t pretend to be a fucking movie because it’s just a car, a car without two thousand different policies to accept in which you will never know what’s written, but a car you can drive even if you decided to wear a red shirt on a Thursday morning, which in your distorted future society is a political insult to some shithead CEO; a car that you own, not a subscription-based loan; a car that keeps very slowly polluting the environment instead of polluting it with heavy chemicals dug up by children, while still emitting about as much CO2 as the next 20 years of the slow-polluting one, not to mention where the electricity comes from; a car that will run forever if you treat it well, with minor fixes and relatively minor environmental impact, and that doesn’t need periodic battery replacement, which btw is like building a new vehicle… These are not only critical thoughts about greenwashing, they’re meant to make you reflect on the different meanings of ownership in different time periods.
And yes, I will always think that all environmentalists who absolutely need a car should drive a 1990s car, fix it, save it from the scrapyard, and drive it till it crashes into a wall…
I would expect that a 90s car could eventually be converted to hydrogen combustion. That would save on pumping petrol (if the hydrogen is not generated from petrol), and it wouldn’t require yet another car to be built.
Press the brake.
WRONG!!!
Hard braking may increase your insurance costs: https://www.nytimes.com/2024/03/11/technology/carmakers-driver-tracking-insurance.html
TL;DR: General Motors was selling customer driving data to LexisNexis, which provided it to insurance companies. Hard braking also contributed to a higher risk factor.
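For illustration, a risk factor like that could be as crude as weighting event counts per mile driven. This is purely a made-up sketch, not LexisNexis’s or any insurer’s actual model:

```python
# Hypothetical telematics scoring sketch; event names and weights are invented.

EVENT_WEIGHTS = {
    "hard_brake": 0.5,
    "hard_acceleration": 0.3,
    "speeding": 1.0,
}


def risk_factor(event_counts: dict, miles_driven: float) -> float:
    """Weighted event count normalised per 1,000 miles; higher = riskier."""
    if miles_driven <= 0:
        return 0.0
    weighted = sum(EVENT_WEIGHTS.get(event, 0.0) * count
                   for event, count in event_counts.items())
    return weighted / (miles_driven / 1000.0)


# A driver who brakes hard 40 times in 800 miles scores far worse than one
# who does it twice over the same distance.
print(risk_factor({"hard_brake": 40}, miles_driven=800.0))  # 25.0
print(risk_factor({"hard_brake": 2}, miles_driven=800.0))   # 1.25
```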
Nah bro, if it’s the choice between raising insurance costs vs killing people + jail time for manslaughter + eating the guilt for the rest of my life, I’ll take the insurance.
Also wth America, your capitalism and your priorities are wack.
They were joking…?
I don’t like the spying aspect, but it is unironically true that if you slam your brakes at every red light, you are driving in a dangerous fashion. It’s more about the pattern than a one-off event, though.
I mean, without getting into the privacy-nightmare piece, frequent hard braking probably means you have a habit of following too closely, or of not paying attention to potential hazards and covering the brake. So I don’t think the car manufacturer should supply it, but I also think it would be good to let the person with the habit know, so that they can learn to be a safer driver.
In the meantime, the EU will require systems that automatically perform emergency braking, and also distinct signaling for emergency braking.
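Conceptually that kind of system boils down to a time-to-collision check. A minimal sketch, with invented thresholds rather than the actual EU spec:

```python
# Generic illustration of automatic emergency braking (AEB) logic.
# Thresholds are invented for the example, not the regulation's figures.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing on the obstacle
    return gap_m / closing_speed_mps


def aeb_action(gap_m: float, closing_speed_mps: float) -> str:
    ttc = time_to_collision(gap_m, closing_speed_mps)
    if ttc < 1.5:
        # Brake hard and flash the hazards so following traffic is warned,
        # which is the "different signaling" part of the requirement.
        return "emergency brake + emergency-braking signal"
    if ttc < 3.0:
        return "warn the driver"
    return "no action"


# Example: a 20 m gap closing at 15 m/s leaves roughly 1.3 s to impact.
print(aeb_action(20.0, 15.0))
```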
Woah woah woah. I’m 99% certain that’s not how cars work.
Press the drift button?
I only see one track there I’m not sure we can do that Dave
Strange to assume that swerving will definitely kill one of them. What if you swerve off the road, or slam on the brakes? The reason the trolley problem works is that it’s on rails and you’re not operating it.
That’s because it’s a Tesla car, silly. It only allows for minimization of victims down to a minimum of one. I’ve heard that newer models have a prediction module that will deploy a rear-mounted gun and shoot down any survivors in case of a narrowly avoided car crash. The seat still does devour the driver if that happens though, for some legacy backwards-compatibility reasons. As for the disembodied Voice that recites all your sins and threatens to reveal them to the public should you NOT take the wheel and kill those people yourself, it’s apparently in Spanish as well now. Such an age of wonders.
It’s a meme
It’s simple: somebody WILL die no matter what you do, this can include you (and nobody deserves it)
Yet, you’re guilty in any situation since you bought a stupid “self-driving” car
I’m sorry, but this is the vanilla trolley problem. Save all but one or avoid going to jail.
I think that’s the point. There’s a follow-up about killing the people tying others to the rails that fits.
I hope this isn’t law anywhere. You’re liable for your car no matter what. You have to take control if necessary
I saw a headline about Mercedes offering an autopilot that doesn’t require the driver to monitor, so it’s going to be interesting to see how laws play out. The Waymo taxi service in Phoenix seems to occasionally have run-ins with the law, and a remote service advisor has to field the call, advising the officer that the company is responsible for the car’s behavior, not the passenger.
So in theory the manufacturer takes responsibility because they trust their software. This puts the oness on them and their insurance, thereby reducing your insurance considerably. In actuality your insurance doesn’t go down because insurance companies.
I’m not trying to be the grammar police, just thought you might like to know that it’s “onus”.
You’re liable for your car no matter what
Nope, it should be law that if an auto manufacturer sells an autonomous driving system that they advertise being able to use while driving distracted then they are liable if someone uses it as advertised and per instructions.
What you wrote is probably an auto manufacturer executive’s wet dream.
“You used our autonomous system to drive you home after drinking completely within advertised use and per manufacturer instructions and still got in an accident? Oh well tough shit the driver is liable for everything no matter what™️”
When autonomous cars are good enough to just drive people around then yeah the companies should be liable, but right now they’re not and drivers should be fully alert as if they are driving a regular vehicle.
When autonomous cars are good enough to just drive people around
they become autonomous cars. It’s not autopilot if I’m liable, simple as that.
- Then don’t call it autopilot
- What’s the point of automated steering if you have to remain 100 % attentive? To spare the driver the terrible burden of moving the wheel a couple mm either way? It is well studied and observed that people are less attentive when they’re not actively driving, which, FUCKING DUH.
Manufacturers provide this feature for the implicit purpose of enabling distracted driving. Yet they will not accept liability if someone drives distractedly.
Next in We Are Not Liable For How Consumers Use Our Product, Elon will replace the speedometer by Candy Crush with small text that says “pwease do not use while dwiving UwU”.
There are already fully autonomous taxis in some cities. Tesla is nowhere near fully autonomous, but others have accomplished it.
“Accomplished” is a strong word for something as complex as autonomous driving.
Fair, but when a company is given the authority to run fully autonomous taxis in cities that’s a huge accomplishment. Granted they are cities that don’t see things like snow storms and I’m sure there is a good reason for that.
Some auto makers have said they will accept liability… https://www.thedrive.com/tech/455/volvo-accepting-full-liability-in-autonomous-car-crashes
The cover-your-ass scenario.
In the Philosophy Crash Course there was a scenario like this. I’ll paraphrase:
You’re a traveler exploring a semi-developed nation in South America. Coming out of the wilderness you come across a squad of soldiers. They are forcing twenty villagers to dig a mass grave. The officer in charge of the soldiers tells you these villagers committed the state crime of supporting a rival to their leader, and are to be executed. But as you are a guest in their country, he will make you an offer: if you shoot one of them yourself, he will set all the rest free, and they can then hike to the border and beg for asylum. (A rough trek, but the neighboring country may take them.)
Do you shoot one of the villagers?
Actually killing someone is rather hard on the psyche, and most of us cannot bear the thought (and might suffer from trauma as a result). But then, perhaps this is a small price to pay for nineteen human lives.
Thomas Aquinas and Kant were happy to let the soldiers kill the villagers so as to avoid committing the sin of murder themselves. Aquinas and Kant would not even lie to the murderer at the door, or to Nazi Jew-hunters, to save the lives of fugitives hidden in their home, since lying was sin enough, and they would count on God to know His own. Both had contemporaries who disagreed and felt it was proper to suffer the trauma and do what was necessary (assuming the officer of the soldiers seemed inclined to keep his word and actually spare the remaining villagers).
So, the cover your own ass response has a long history of backers, including known philosophers.
@uriel238 @mondoman712
In the days before the Wannsee Conference (the Nazis setting up the death camps), but after the invasion of Poland, when most executions were carried out by firing squad, there were German tourists who would travel to take part in the firing squads. So the trauma is not universal across the human experience, and there are circumstances that will lead individuals to kill. Lynchings and massacres in the US are examples of this happening without a war to give the killings cover.

We’ve seen a similar phenomenon in some of the red states in the ideology conflict here in the US. There are people eager to kill someone just to have the experience, and who volunteer to hunt targeted groups (trans folk, lately) or to take part in an execution by firing squad. I remember in John Oliver’s first segment on the death penalty (he did a second one recently) that executions were stalled due to difficulties obtaining the drugs used in lethal injections, and firing squads were brought up. The expert pointed out the difficulty of finding one executioner, let alone seven. The officials suggested recruiting volunteers from the gun-enthusiast citizenry, which the expert saw as naïve.
I can’t speak to firing-squad executions during the German Reich and the early stages of the Holocaust, but I can speak to the Einsatzgruppen, who were tasked with evacuating villages (to mass graves) that harbored Jews, harbored enemies of Germany, or were otherwise deemed unworthy of life. The mass executions were hard on the troopers, and as a result Heydrich contended with high turnover rates.
This figured largely into the movement toward the industrialized genocide machine that pivoted around the Auschwitz proof of concept. Earlier phases included wagons with an enclosed back into which the engine exhaust was piped. The process was found to be too slow, and it exposed too many service people to the execution process. The death camps were staffed so that no one had to both interact with the prisoners and process the bodies, so no one would have to confront the visceral reality of before and after. They were staffed so that anyone who engaged a mechanism was two steps away from the person authorizing (and taking responsibility for) the execution. The guy who flipped the switch was just following orders.
Interestingly, we’d see a repeat of this during the International War on Terror, specifically the Disposition Matrix, which led to executions of persons of interest in the field by drone strike (a Hellfire missile launched from a Predator drone). During the CIA drone strike programs in Afghanistan and Pakistan, the drone operation crews suffered from a high turnover rate, with operators suffering combat PTSD from having pulled the trigger on the missile launches. It didn’t help that they were also required to scan the damage to assess the carnage and identify the casualties.
Interestingly, this also presented an inverted demonstration of how the human mind can tell the difference between violent video games and the real thing. Plenty of normies play Call of Duty without dealing with the mental after-effects of war, but even when we conduct war operations from continents away, our brains recognize that we are killing actual human beings, and we suffer trauma from the act. War continues to be Hell, and video games not so much.
Reminds me of the Chinese issue: you run over someone, but they are likely not dead. Will you save their life but accept having to pay for whatever healthcare costs they have until they are recovered? Or will you run over them again, to make sure they die and your punishment will be a lot lighter?