And "braking hard" is not a universal solution.
I'm glad you worked in the City, not in safety-critical software!
BBC News: US opens investigation into Tesla after fatal crash - http://www.bbc.co.uk/news/technology-36680043
Tesla has not made any statement of "why", only that "neither the Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied". That much is certainly true.
They haven't attributed it to blind spots, sensor failure, or anything else. Don't get me wrong, it's likely that something should be changed in the sensor system and/or software, but at the moment it's a VERY carefully worded statement that avoids asserting anything they don't know to be true.
What you can be sure of is that they'll spend tens of millions to ensure the chance of this happening again is significantly reduced.
A number of people will always want sports cars, and that won't disappear within our generation, but for the majority of people, not having to actively drive would be brilliant.
And of course you are right that they will fix it. But as I said before, the law in this area is vague at best and wide open to litigation (rather like the concerns over US drone law), and the car makers are being foolish to let it be settled through test cases like this.
Precisely why I posted it in the first place.
One thing does occur to me though: How many lives have been saved by the system when it does work?
I'm glad you don't work for Tesla!
But, on second thoughts it would actually explain a lot... :-?
Part of the potential problem with this kind of tech, though, is that it presents a "worst of both worlds" situation: the driver is bored and less alert because the car is doing the bulk of the work, but is then suddenly expected to jump in in the event of danger, slowing reaction time massively, or potentially turning a danger the computer could have avoided into a human-controlled crash.
Watching a video while "driving" is dumb, but it's the kind of user error you can expect with this kind of tech, in a Murphy's-law kind of way. I'm pro self-driving cars, but it really needs to be fully autonomous or nothing.
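To put rough numbers on that handover problem, here's a quick back-of-the-envelope sketch. The speeds and the 2-3 second takeover delay are illustrative assumptions, not measured figures:

```python
# Rough illustration: distance covered while a disengaged driver
# "jumps back in". The takeover delays are assumptions for
# illustration, not measured figures.

MPH_TO_MS = 0.44704  # metres per second per mph

def distance_during_takeover(speed_mph: float, takeover_s: float) -> float:
    """Metres travelled before the human is back in control."""
    return speed_mph * MPH_TO_MS * takeover_s

for speed in (30, 50, 70):
    for delay in (2.0, 3.0):
        d = distance_during_takeover(speed, delay)
        print(f"{speed} mph, {delay:.0f} s takeover: {d:.0f} m travelled")
```

At 70 mph and a 3 second takeover delay, that's nearly 100 m covered before the driver is even back in control, let alone reacting.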
What you suggested was that if the car loses data from one sensor it is "completely blind". Not true.
You then suggested that having lost data from one sensor it should brake hard. No qualification.
That is patently ridiculous.
Now you're trying to bend what I've posted into a different scenario - which didn't happen - to try to fit a claim you didn't make.
Good luck with that!
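For what it's worth, the usual approach in fault-tolerant control is neither "completely blind" nor "brake hard": you degrade gracefully using the remaining sensors. A minimal sketch of that idea in Python (all names, sensor types, and responses here are hypothetical, and none of it reflects Tesla's actual logic):

```python
# Hypothetical sketch of graceful degradation on sensor loss.
# The sensor set and the responses are illustrative assumptions,
# not any manufacturer's actual implementation.

from dataclasses import dataclass

@dataclass
class SensorStatus:
    camera_ok: bool
    radar_ok: bool
    ultrasonic_ok: bool

def respond_to_sensor_loss(status: SensorStatus) -> str:
    healthy = sum([status.camera_ok, status.radar_ok, status.ultrasonic_ok])
    if healthy == 3:
        return "normal operation"
    if healthy == 2:
        # One sensor lost: cross-check against the remaining sensors,
        # shed speed gently, and warn the driver. No hard braking.
        return "reduce speed gradually, alert driver to take over"
    # Multiple failures: come to a controlled stop when it is safe
    # to do so, rather than slamming the brakes in live traffic.
    return "controlled stop in safest available position"
```

The point being that a single lost sensor triggers a qualified, gradual response, not an unqualified emergency stop.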
Er, Tesla said their sensors didn't see the articulated truck.
From what I've read elsewhere, it's at higher speeds that autonomous cars have more problems. The distance ahead that the sensors can see accurately, and process quickly enough to react, isn't that great.
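That matches the basic physics: braking distance grows with the square of speed, so at motorway speeds the total stopping distance can exceed the range at which sensors reliably classify obstacles. A rough illustration, where the 60 m "reliable sensor range", the reaction delay, and the friction coefficient are all assumed for the example:

```python
# Stopping distance vs. an assumed reliable sensor range:
#   d_total = v * t_react + v**2 / (2 * mu * g)
# The sensor range, reaction time, and friction coefficient below
# are illustrative assumptions, not figures for any real system.

G = 9.81           # gravity, m/s^2
MU = 0.7           # assumed tyre-road friction on dry tarmac
T_REACT = 0.5      # assumed detection/actuation delay, s
SENSOR_RANGE = 60  # assumed reliable classification range, m

MPH_TO_MS = 0.44704

for speed_mph in (30, 50, 70, 90):
    v = speed_mph * MPH_TO_MS
    stopping = v * T_REACT + v**2 / (2 * MU * G)
    verdict = "OK" if stopping <= SENSOR_RANGE else "exceeds sensor range"
    print(f"{speed_mph} mph: stopping distance {stopping:.0f} m ({verdict})")
```

Under those assumptions, 30 mph stops comfortably within range (~20 m), while 70 mph needs roughly 87 m, well beyond the assumed 60 m the sensors can reliably cover.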