The difference between what a lawyer claims and what a jury believes.

Mark S.

Reading this article brought to mind all the lawsuits involving the Bronco Sport. Lawyers make all sorts of claims in their filing documents, some of which can be pretty inflammatory (no pun intended). But just because a lawyer files a suit claiming this or that about the named defendant doesn't make those claims true. I liken the claims made in a lawsuit to the opening bid in a price negotiation. You bid as far in your favor as possible with the hope of meeting somewhere near an unnamed price in your head.

The other thing that jumped out is how the facts in the article are framed by the author. For example:

Micah Lee was driving his Tesla Model 3 on a highway outside of Los Angeles at 65 miles per hour when it turned sharply off the road and slammed into a palm tree before catching fire.
Note how it strongly implies the car turned on its own. This could have been copied and pasted right from the plaintiff's filing. If I were writing this and wanted to avoid any appearance of bias I would say:

Micah Lee was driving his Tesla Model 3 on a highway outside of Los Angeles at 65 miles per hour before the crash which took his life. The cause of the crash is the subject of the suit.
Read everything with your skeptic's hat on. All media sources and journalists have a bias. Sometimes it's obvious, but often it's not.

BravoAlpha

The “journalists” rise through clicks and views. The more bias, fear, and shock they can convey, the more money they make. It doesn’t even have to be true; the slightest sprinkling of relevance is enough for it to take hold. The lawyers don’t need to be right, they need to win.
 

coopny

I have very mixed feelings about this lawsuit and its result. The Verge has an article on the issues the California DMV has raised with Tesla's autopilot/self-driving pages.

Let's start with a defense for Tesla:
  • The driver of the Tesla who died was allegedly using Autopilot and had a blood alcohol content of 0.05%. That's short of the 0.08% legal standard for a DUI in California, but it's enough that it could have impaired the driver's alertness.
Now let's move on to the cons:
  • Tesla treats multi-ton metal boxes moving at high speed like software, not hardware. Just "patch it out." Even the way Teslas employ much of their driver-assist technology has been retroactively crippled: when Teslas that have radar modules (which help the car sense what's around it and how fast those things are moving) go in for service, the technicians are ordered to disable the radar so the car relies on cameras alone.
  • Tesla uses the defense that Autopilot and "Full Self-Driving" (FSD) carry beta warnings and tell the driver they have to keep watching. But these boilerplate disclaimers come after marketing material saying things like FSD "is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat" and that it can "[navigate] urban streets, complex intersections and freeways." So they market it as the car driving itself, then put an asterisk on a screen saying you're still liable to pay attention and be ready to take over at any time.
  • Tesla's other defense (besides the alcohol) is that it was "unclear" whether Autopilot was engaged at the time. I don't see how Tesla wouldn't know. The telematics in a Tesla are constant, even if the crash destroyed any recordings stored locally in the vehicle (which burst into flames).
This also has to be taken in the context of what the jurors were actually asked, which may have shaped the verdict:
CNBC said:
Matthew Wansley, a former general counsel of nuTonomy, an automated driving startup, and associate professor at Cardozo School of Law, said the steering issues in this case differ from those in other cases against Tesla.

In those lawsuits, plaintiffs allege Autopilot is defectively designed, leading drivers to misuse the system. The jury in Riverside, however, was only asked to evaluate whether a manufacturing defect impacted the steering.

“If I were a juror, I would find this confusing,” Wansley said.
In particular, that language is confusing. Can a manufacturing defect only constitute hardware? Does it apply to safety systems if there's a patchable software flaw?

There are more lawsuits to come over other accidents in which plaintiffs allege drivers were injured or killed due to Autopilot/FSD behavior, so the next case may not go the same way.

I will offer a slight criticism of Ford on the Copilot Assist 360++ lane centering (which is different from the lane keeping in the standard Copilot Assist 360: it actually keeps you centered in the lane, versus vibration and some steering assist when you drift out of it). Ford's system has no driver-facing camera to make sure the driver is paying attention; it relies on steering-wheel torque. So it's possible to fool it and not have your hands on the wheel. I found this image as an example on another Ford vehicle forum:
[attached image]

(It should go without saying, but if you are considering doing this, please for the love of god do not.)

Teslas used to be fooled by similar tricks, but now in the Model 3 and Model Y the driver-facing camera checks whether the driver is actually looking at the road. So in that regard the implementation is better than Ford's, but Tesla also grossly overmarkets what its ADAS systems are capable of (Ford does not).
 
Mark S. (OP)
I don't know how much clearer Tesla can be; drivers are warned they are using a "beta" version of software. Is there anyone who doesn't know what that means?

BTW, both you and the CA DMV left an important part out of the marketing material published on Tesla's website:

Some features require turn signals and are limited in range. The future use of these features without supervision is dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions.
It's pretty clear to me that while Tesla designed the system to operate absent driver action, it does not currently function that way. Both this statement and the beta warning make it clear Tesla currently expects drivers to supervise the system while it is in operation. I don't know of anyone who doesn't understand that auto manufacturers expect owners to supervise the computer while operating their vehicle with the various driver-assist technologies. The fact that people "trick" the systems isn't evidence of bad-faith marketing by manufacturers, it's evidence of individual stupidity.

I don't know the particulars of what telemetry data is and is not logged by Tesla vehicles, what and how often the cars transmit back to Tesla, nor what Tesla retains. But assuming it DOES log whether the vehicle is in self-driving mode, and is "constant" as you claim, it's entirely possible the car was unable to transmit because it was in a location with no contact with a cell tower, and the on-board data storage system was destroyed in the ensuing fire. Court cases like this involve a discovery process in which parties are required to divulge such information. If plaintiffs asked for it--and I cannot imagine any competent lawyer who would not--Tesla would be required to divulge it. It seems unlikely to me that, if Tesla had such data and withheld it despite being legally required to divulge it, no one among all the people at the company involved in collecting and storing this kind of data would have spoken up.
 

Bucko

I for one would never trust a "self driving vehicle". And the saying that Jesus is my co-pilot? Yea, I'd trust him a lot more, but I'd still have my hands on the steering wheel.

All this nonsense of self driving and self parking...just take the bus if you don't want to drive.
 


coopny

I don't know how much clearer Tesla can be; drivers are warned they are using a "beta" version of software. Is there anyone who doesn't know what that means?
Plenty. Google kind of destroyed the meaning of the term by keeping Gmail labeled as a beta for five years while hundreds of millions of people used it as their primary email.

BTW, both you and the CA DMV left an important part out of the marketing material published on Tesla's website:

It's pretty clear to me that while Tesla designed the system to operate absent driver action, it does not currently function that way. Both this statement and the beta warning make it clear Tesla currently expects drivers to supervise the system while it is in operation. I don't know of anyone who doesn't understand that auto manufacturers expect owners to supervise the computer while operating their vehicle with the various driver-assist technologies. The fact that people "trick" the systems isn't evidence of bad-faith marketing by manufacturers, it's evidence of individual stupidity.
Let's split that hair. Tricking the systems is individual stupidity; however, enabling that stupidity has consequences. Subaru and others (now including Tesla) are supplementing torque (weight) on the wheel as an input for whether the driver is monitoring the ADAS with driver monitoring systems that use camera tracking, to make sure the driver is awake (not asleep or passed out) and looking at the road. This reduces the risk of the system remaining engaged when the driver is not attentive. Prior to Tesla enabling such camera tracking for driver alertness, there were multiple cases of police cars chasing passed-out drivers whose Teslas had Autopilot on for many miles. Some involved drivers who deliberately defeated the steering-wheel hand detection; in others, the drivers passed out with their hands on the wheel, so the system never disengaged. The NHTSA probe into Tesla is leaning in this direction:

NHTSA Administrator Ann Carlson said:
It’s really important that drivers pay attention. It’s also really important that driver monitoring systems take into account that humans over-trust technology.
On the point of Tesla disclaiming it: Tesla calls the systems "Autopilot" and "Full Self-Driving" (charging a separate fee/subscription for FSD), and then says further down the page that it requires monitoring for the foreseeable future with no end in sight. It's misleading marketing. Ford calls its system "CoPilot," not "Ford Full Self-Driving*" (*not actually self-driving).

I don't know the particulars of what telemetry data is and is not logged by Tesla vehicles, what and how often the cars transmit back to Tesla, nor what Tesla retains. But assuming it DOES log whether the vehicle is in self-driving mode, and is "constant" as you claim, it's entirely possible the car was unable to transmit because it was in a location with no contact with a cell tower, and the on-board data storage system was destroyed in the ensuing fire. Court cases like this involve a discovery process in which parties are required to divulge such information. If plaintiffs asked for it--and I cannot imagine any competent lawyer who would not--Tesla would be required to divulge it. It seems unlikely to me that, if Tesla had such data and withheld it despite being legally required to divulge it, no one among all the people at the company involved in collecting and storing this kind of data would have spoken up.
FSD can upload 4 GB of data in a single day. However, owners can opt out of the data-sharing consent.

The driver crashed in Menifee, CA, which is fairly inland, so the possibility of a lack of cellular service is there. The vehicle fire may also have destroyed the on-board event data recorder.

I think that, in a civil case decided on the preponderance of the evidence (the driver was not legally DUI but was at a BAC where ability could be impaired, and it was inconclusive whether Autopilot was on), the jury's ruling probably makes sense (barring some other facts I'm not aware of; I can see the docket online, but not any of the filed documents).

But I wouldn't be surprised at all if autopilot was a factor here, given the probes by the NHTSA and others, and how many fatal crashes have involved Tesla ADAS systems.

Tesla leads the pack in accidents where the car has level 2 ADAS:

CNBC said:
NHTSA publishes data regularly on car crashes in the U.S. that involved advanced driver assistance systems like Tesla Autopilot, Full Self Driving or FSD Beta, dubbed “level 2” under industry standards from SAE International.

The latest data from that crash report says there have been at least 26 incidents involving Tesla cars equipped with level 2 systems resulting in fatalities from Aug. 1, 2019, through mid-July this year. In 23 of these incidents, the agency report says, Tesla’s driver assistance features were in use within 30 seconds of the collision. In three incidents, it’s not known whether these features were used.

Ford Motor is the only other automaker reporting a fatal collision that involved one of its vehicles equipped with level 2 driver assistance. It was not known if the system was engaged preceding that crash, according to the NHTSA report.
It's also no coincidence that Tesla stopped reporting Autopilot safety statistics every quarter, which they did from 2018 to 2022. The NHTSA data makes the likely reason obvious: Autopilot/FSD are getting worse, with more accidents per million miles driven.

But all of this comes from a car company with a CEO that basically promises that currentYear + 1 will be the year true FSD comes out... in 2016 he promised that by 2018, you'd be able to summon your Tesla from across the country. And the system is still nowhere near this.
 
Mark S. (OP)
Tricking the systems is individual stupidity; however, enabling that stupidity has consequences.
We're getting into thorny issues of philosophy here. Is Ford responsible for the maniac who drove his Explorer through a crowd of people in Waukesha last Christmas? Making companies responsible for a customer's misuse of their products seems Orwellian to me. At the very least it does not bode well for innovation.

It's misleading marketing.
That's clearly your opinion. Whether or not a court agrees remains to be seen. I think Tesla provides plenty of information to its customers regarding proper use of its technology.

It's also no coincidence that Tesla stopped reporting Autopilot safety statistics every quarter, which they did from 2018 to 2022. The NHTSA data makes the likely reason obvious: Autopilot/FSD are getting worse, with more accidents per million miles driven.
The NHTSA data at the link you provided lists the total number of accidents, not accidents per million miles. Nor does it identify who/what was at fault. It also includes the following disclaimer:

The totals in some charts may be greater than the number of crashes, because of several factors, including multiple sources for the same crash, multiple entities reporting the same crash, and multiple entities reporting the same crash but with different information.
And even if the number of accidents per million miles is increasing for vehicles using driver assist that doesn't necessarily spell doom for such systems. The relevant statistic is how they compare to human-driven vehicles. As we all know, you can be the most skilled, defensive, conscientious, courteous, thoughtful, and all around best driver on the road and still get into an accident because of some other driver's mistake.
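
To put numbers on that, here's a tiny illustration (Python, with entirely made-up figures, not real data) of why the exposure-adjusted rate, not the raw crash count, is the statistic that would actually settle the question:

Code:
# Illustrative arithmetic only; every number below is invented for the example.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Exposure-adjusted crash rate: crashes per one million miles driven."""
    return crashes / (miles / 1_000_000)

# Hypothetical: 5 crashes over 100 million ADAS-engaged miles...
adas_rate = crashes_per_million_miles(5, 100_000_000)
# ...versus 400 crashes over 10 billion human-driven miles.
human_rate = crashes_per_million_miles(400, 10_000_000_000)

print(f"ADAS:  {adas_rate:.3f} per million miles")   # 0.050
print(f"Human: {human_rate:.3f} per million miles")  # 0.040

In that made-up example the ADAS total looks far lower, yet its per-mile rate is actually worse, which is exactly why raw totals alone can't tell us whether these systems are getting better or worse.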

I'm not sure how Musk's overly optimistic marketing claims relate to this discussion. Do you know of any innovative company seeking investment that ISN'T optimistic about release dates?
 

coopny

We're getting into thorny issues of philosophy here. Is Ford responsible for the maniac who drove his Explorer through a crowd of people in Waukesha last Christmas? Making companies responsible for a customer's misuse of their products seems Orwellian to me. At the very least it does not bode well for innovation.
Darrell Brooks deliberately provided the inputs (throttle, steering, etc.) that drove the car into a parade, so no, Ford has no culpability there. It'd be a different story if it weren't a maniac hellbent on injuring and killing people but an innocent driver relying on an ADAS that malfunctioned.

That's clearly your opinion. Whether or not a court agrees remains to be seen. I think Tesla provides plenty of information to its customers regarding proper use of its technology.
Still making its way through the legal system (there's more than one case).

The NHTSA data at the link you provided lists the total number of accidents, not accidents per million miles. Nor does it identify who/what was at fault. It also includes the following disclaimer:
Tesla reported the data on millions of miles driven with their ADAS quarterly from 2018 to some point in 2022. Then they stopped. They are the only ones with access to this data.

There are certain mandated reporters to the NHTSA when there's an accident involving ADAS (particularly if it's lethal), and then that filters into the NHTSA link. The NHTSA has no visibility into how many miles are driven by Tesla vehicles that have been driven without incident. Only Tesla has this data. And Tesla is no longer publishing this data to the public (ostensibly because it would have shown worsening safety).

This is, of course, from the same company that is disabling ultrasonic sensors already installed in vehicles to bring all Autopilot installs to parity (because the Model 3 and Model Y skipped radar in favor of doing all ADAS via camera).

And even if the number of accidents per million miles is increasing for vehicles using driver assist that doesn't necessarily spell doom for such systems. The relevant statistic is how they compare to human-driven vehicles. As we all know, you can be the most skilled, defensive, conscientious, courteous, thoughtful, and all around best driver on the road and still get into an accident because of some other driver's mistake.
Culpability is a huge one. Human drivers could be 10x worse, but if courts find the automaker liable for an issue in an ADAS that causes a crash, then the deep-pocketed automaker becomes the target of lawsuits, and potentially not the individual, who has far less auto insurance. This is why true FSD is such an aspirational goal: the standard is not "safer than a human being," it has to be many times safer than a human being, because the human being will no longer be liable for the vehicle's operation (again, once we get "true FSD," which is not what Tesla is offering right now).

Level 2 ADAS is hazier territory, because in theory the driver should be watching at all times. Using multiple inputs (wheel torque, driver attention via camera) helps an automaker do all it reasonably can to make sure a driver is attentive before keeping the ADAS engaged. Tesla using the driver-facing camera, in models that have it, to gauge driver attentiveness is a step in the right direction. It also isn't invasive (unlike Mothers Against Drunk Driving wanting all cars to have automatically engaged alcohol detection that won't allow the car to start if the driver is drunk; a passive technology that could do this without the driver actively blowing into a breathalyzer is science fiction, and making everyone who buys a new car blow into a traditional breathalyzer would be costly and inconvenient for the vast majority of drivers, who are law-abiding).
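
To make the multi-input idea concrete, here's a rough sketch of that kind of gating logic. This is purely illustrative Python, not any automaker's actual implementation; all of the names and thresholds are invented.

Code:
from dataclasses import dataclass

@dataclass
class DriverState:
    wheel_torque_nm: float        # torque the driver is applying to the wheel
    eyes_on_road_seconds: float   # time since the camera last saw eyes on the road

def adas_may_stay_engaged(state: DriverState,
                          min_torque_nm: float = 0.3,
                          max_eyes_off_s: float = 2.0) -> bool:
    """Require BOTH hands-on-wheel torque AND recent eyes-on-road to keep the
    assist engaged; either signal alone is easier to spoof or to satisfy while
    inattentive (a weight on the wheel, or hands on the wheel while asleep)."""
    hands_on = state.wheel_torque_nm >= min_torque_nm
    eyes_on = state.eyes_on_road_seconds <= max_eyes_off_s
    return hands_on and eyes_on

# Example: a wheel weight supplies torque, but the camera sees eyes off the road,
# so the system should warn and eventually disengage.
print(adas_may_stay_engaged(DriverState(wheel_torque_nm=1.2, eyes_on_road_seconds=8.0)))  # False

The point is just that requiring both signals is much harder to satisfy while inattentive than wheel torque alone.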

I'm not sure how Musk's overly optimistic marketing claims relate to this discussion. Do you know of any innovative company seeking investment that ISN'T optimistic about release dates?
I don't know of any other automaker doing it, for good reason: you can't make bold forward-looking statements like that without disclaimers. People are buying Tesla cars on the promise that soon you'll be able to tell your phone, "Batman, pull the car around and drive me to the airport." It factors into both the decision to buy the vehicle and the willingness to pay for FSD.

More importantly, it reflects Tesla's cavalier attitude about pushing ADAS capabilities that are not ready, teasing that the true holy grail of napping while your car takes you where you're going is just around the corner. Tesla keeps raising the FSD price (it started at $5,000, went up to $15,000, and was dropped to $12,000 this September as demand for their vehicles weakened) under the tease that it's just around the corner and basically already there.

These are all factors that cause people to place too much confidence in the ADAS, to the detriment not only of themselves but of others. Tesla's ADAS has a tendency to crash into stopped first-responder vehicles.
 
Mark S. (OP)
Darrell Brooks deliberately provided the inputs (throttle, steering, etc.) that drove the car into a parade, so no, Ford has no culpability there. It'd be a different story if it weren't a maniac hellbent on injuring and killing people but an innocent driver relying on an ADAS that malfunctioned.
We are discussing whether a manufacturer should be held liable for customer misuse of its products. You brought it up in the context of a driver disabling and/or spoofing systems meant to ensure drivers supervise the driver aid technology they are using. If someone deliberately disables or spoofs features to avoid driver aid supervision then they are not innocent.

And Tesla is no longer publishing this data to the public (ostensibly because it would have shown worsening safety).
So? Do you think the plaintiff's lawyers in this case failed to ask for that data? Not knowing something is proof of nothing; innuendo is not fact.

I don't know of any other automaker doing it, for good reason: you can't make bold forward-looking statements like that without disclaimers. People are buying Tesla cars on the promise that soon you'll be able to tell your phone, "Batman, pull the car around and drive me to the airport." It factors into both the decision to buy the vehicle and the willingness to pay for FSD.
Okay, but what does this have to do with the topic? We are discussing whether claims by lawyers in lawsuits, and slanted articles written about those lawsuits, should be taken at face value. The whole point of my post is that the fact someone files a lawsuit isn't proof the defendant did anything wrong. In my mind you've been arguing that the jury deciding the case we are discussing got it wrong. Are you suggesting if the jury had been aware of Musk's public comments regarding the future of Tesla's driver assist technologies the verdict would have been different? How could they not be aware?

You may very well be right that Musk gets his peepee slapped by the SEC for his claims regarding Tesla's technology. You might also be wrong. We'll see. Regardless, I don't see how this is evidence refuting my point that claims by lawyers (and the media) are almost always as inflated as Musk's claims regarding Tesla technology.

BTW, the article you linked also mentioned the lawsuit filed by Tesla investors who claimed Musk defrauded them. That lawsuit ALSO failed. After reading all the media reports in the weeks leading up to the verdict, I wonder how many people were surprised by THAT verdict?
 

coopny

I feel like we're crossing some streams here.

We are discussing whether a manufacturer should be held liable for customer misuse of its products. You brought it up in the context of a driver disabling and/or spoofing systems meant to ensure drivers supervise the driver aid technology they are using. If someone deliberately disables or spoofs features to avoid driver aid supervision then they are not innocent.
Misuse of a product that has no capability to govern its own operation is different from misuse of one that operates itself while expecting human input. It's sort of like how most riding lawn mowers have a seat switch: if the owner gets up or falls off, the mower stops, for both their safety and the safety of others. Someone can defeat that with a 50 lb dumbbell, but most people with common sense won't, and it wouldn't be John Deere's or Snapper's fault if you did.

Similarly, Tesla and Ford vehicles have used steering-wheel weight/torque as the determining factor for whether a driver is paying attention to steering. That check has been defeated both by drivers with bad behavior but no specific intent to defeat it (passing out at the wheel from fatigue or drugs/alcohol with hands still on the wheel, or hands on the wheel but eyes off the road because they're distracted by something like a dropped phone or fiddling with the radio) and by drivers who deliberately defeat it, like with the wheel weight pictured earlier.

So? Do you think the plaintiff's lawyers in this case failed to ask for that data? Not knowing something is proof of nothing; innuendo is not fact.
I can't make any representations as to what the plaintiff's lawyers asked for and what the judge allowed in discovery, because these files are not publicly accessible. Just because you ask for something in discovery doesn't mean the other party won't argue against it for a variety of reasons, or that the judge will side with you in that argument.

Tesla may have argued that since there was no data proving Autopilot was on, there was no reason to admit Autopilot data, or that statistics on its operation could unduly influence the jury. There's no way for me to say what was requested, what was argued, and what was granted.

Tesla's accident numbers as reported to the NHTSA are rising at the same time that Tesla has stopped publishing its own safety data. To me, the combination smells. If it looks like a duck and quacks like a duck...
WaPo said:
Former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, said the surge in Tesla crashes is troubling.
“Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by The Post. One likely cause, she said, is the expanded rollout over the past year and a half of Full Self-Driving, which brings driver-assistance to city and residential streets. “The fact that … anybody and everybody can have it. … Is it reasonable to expect that might be leading to increased accident rates? Sure, absolutely.”
Cummings said the number of fatalities compared with overall crashes was also a concern.
The whole point of my post is that the fact someone files a lawsuit isn't proof the defendant did anything wrong. In my mind you've been arguing that the jury deciding the case we are discussing got it wrong.
You're taking the wrong read on my comments, I'm afraid. As I already acknowledged above:
I think that, in a civil case decided on the preponderance of the evidence (the driver was not legally DUI but was at a BAC where ability could be impaired, and it was inconclusive whether Autopilot was on), the jury's ruling probably makes sense (barring some other facts I'm not aware of; I can see the docket online, but not any of the filed documents).
Are you suggesting if the jury had been aware of Musk's public comments regarding the future of Tesla's driver assist technologies the verdict would have been different? How could they not be aware?
I think that, like in most jury selection, potential jurors are screened. If I were an attorney representing Tesla, I would probably be asking whether people had strong opinions about or knowledge of Elon Musk, Tesla, or the safety of Tesla's autonomous driving. If I were a juror in that pool, I would probably answer honestly that I don't believe Tesla's technology is safe, and I would be stricken from the juror list and sent home. It's similar to how, in a criminal case, jurors ideally should not be familiar with the victim, the perpetrator, or the crime before serving, if that's at all humanly possible (hard in cases like the Boston Marathon bombing).

In terms of what the plaintiffs' side did or didn't present, I can't make any representations. I can't know how much this was discussed during witness testimony or in opening and closing arguments, or what exhibits were shown to the jury.

Even experienced lawyers can make fundamental mistakes in complex, high-stakes cases. When Epic Games sued Apple over the 30% commission Apple charges on all in-app purchases in the App Store, in a lawsuit with thousands of pages of documents, one key thing that came back to bite them was that the legal argument they technically made was that Apple imposing a commission (any commission at all) was unfair, not that the amount was excessive (which is what a lot of the supporting documentation was about: credit card processing runs more like 2-4%). The judge basically said, "Well, if you had argued that a 30% commission was excessive, then I could have ruled on whether or not that's true, but if the legal argument you gave me on paper was technically that any commission is excessive, I can't agree."

Winding down: I agree with your point about how the article phrased that sentence. I actually don't think it was deliberate clickbait by the Engadget weekend editor; I just think it was careless. Either way, it was irresponsible on Engadget's part. (Though sadly less irresponsible than some outlets that are leaning on AI-generated content with little or no human intervention...)

But the overall topic of the lawsuit and the circumstances around it are presented as some grand vindication of Tesla ADAS by many media sources, and I don't necessarily agree with that. This lawsuit's jury verdict seems to hinge on it being unprovable that autopilot was engaged at the time or shortly before the crash.

This could wind up with a larger discussion of computer ethics like the THERAC-25 and the Boeing 737 MCAS software, but I feel like this rabbit hole is deep enough already.
 


Dude

TL;DR
Last few posts need an executive summary at the beginning 😉
 
Mark S. (OP)
I feel like we're crossing some streams here.
That's almost certain!

Misuse of a product that has no capability to govern its own operation is different from misuse of one that operates itself while expecting human input.
You're arguing that there are degrees of blame, and yes, I agree. But if you start down the path of holding a manufacturer accountable for customer misuse of its products, at what degree do you stop? In the examples you mentioned, a customer not only ignored a warning to supervise the technology, they also took steps to physically defeat a feature meant to enforce compliance. While not evidence of intent to harm, I would classify that as criminal negligence.

I think that, like in most jury selection, potential jurors are screened.
Yes, both sides get to pick jurors. The process is supposed to ensure that neither side can stack the deck. I get that some lawyers are better at it than others, but we could speculatively pick apart the case all day and we would wind up right back where we are--with a verdict from a jury.

But the overall topic of the lawsuit and the circumstances around it are presented as some grand vindication of Tesla ADAS by many media sources, and I don't necessarily agree with that.
Neither do I, and that wasn't my original point. My point was that the claims made by lawyers in lawsuits are rarely accurate. They are often meant to generate headlines and encourage movement toward settlement, and readers should view them that way rather than as evidence of wrongdoing, as they have been presented in some posts on this forum.

This could wind up with a larger discussion of computer ethics like the THERAC-25 and the Boeing 737 MCAS software, but I feel like this rabbit hole is deep enough already.
Agreed, deep enough. Good discussion, enjoyed it!
 

CierraGraves

You've raised some valid points about lawsuits and how claims are framed in articles. It's essential to approach legal matters with a critical eye and understand that claims made in lawsuits are often just the starting point for negotiations.
Media bias is indeed something we all need to be aware of when consuming news. Your advice to read everything with a skeptic's hat on is spot on.
If anyone finds themselves in a legal situation, considering legal advice from experts like https://federal-lawyer.com/whistleblower-lawyers/qui-tam/ is a wise move. They can provide valuable insights and guidance.