Tesla Autopilot cars


Comments

  • octatonic Frets: 33989
    The responsibility is still with the driver.
  • Chalky Frets: 6811
    Sporky said:
    Chalky said:
    Sporky said: Is braking hard always safe?

    My point is that it has a known state in which its safety is lost - it should handle that in the safest manner it can, and loss of kinetic energy appears to be the common choice on these systems.

    Having worked on some real-time systems I can assure you it's a lot more complex than that.

    Emergency braking when data is incomplete is ludicrous. What if the rear sensors fail? Should the car slam on the brakes then?

    Then you should bloody well know the system should not permit a known error state to persist, and has to reduce its risk profile as quickly as possible to the point where the known error state is fixed or neutral.
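    The "reduce its risk profile" behaviour being argued for here can be sketched as a toy policy - purely illustrative, with made-up function names and thresholds, and no relation to how Tesla's actual system behaves:

```python
# Toy sketch of a degraded-mode policy: when sensor data degrades,
# shed kinetic energy gently and demand driver attention, rather than
# slamming on the brakes (which creates its own hazard behind).
# All thresholds and action names here are invented for illustration.

def degraded_mode_action(sensor_confidence: float, speed_mph: float) -> str:
    """Pick an action given sensor health (0.0 = blind, 1.0 = healthy)."""
    if sensor_confidence >= 0.9:
        return "continue"          # system healthy, carry on
    if sensor_confidence >= 0.5:
        return "alert_driver"      # degraded: hand attention back
    # Badly degraded: the known error state must not persist,
    # so reduce speed - but in a controlled way, not an emergency stop.
    if speed_mph > 10:
        return "coast_and_slow"
    return "controlled_stop"
```

    The contested design question is exactly the middle band: how aggressively to shed speed when the system knows it is degraded but traffic behind makes hard braking risky.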
  • Sporky Frets: 29194
    Chalky said:
    Then you should bloody well know the system should not permit a known error state to persist, and has to reduce its risk profile as quickly as possible to the point where the known error state is fixed or neutral.
    Indeed.

    And "braking hard" is not a universal solution.
    "[Sporky] brings a certain vibe and dignity to the forum."
  • Chalky Frets: 6811
    Er, earlier commenters on this thread (and I trust them) suggest he was doing 85mph so I would humbly submit that loss of kinetic energy would be the first choice in this case... ;)
  • Sporky Frets: 29194
    So as soon as the sun is bright enough to limit one sensor the car should "brake hard" at 85mph?

    I'm glad you worked in the City, not in writing safety-critical software!
    "[Sporky] brings a certain vibe and dignity to the forum."
  • stickyfiddle Frets: 27833
    Chalky said:
    stickyfiddle said:
    Chalky said:
    Sporky said: Is braking hard always safe?

    My point is that it has a known state in which its safety is lost - it should handle that in the safest manner it can, and loss of kinetic energy appears to be the common choice on these systems.

    I think the point, at the moment, is that it's not yet clear whether the car was at fault, save for a handful of commenters on news articles claiming the car has a blind spot.

    No, Tesla themselves have made a statement of what they believe happened and why their system didn't see the truck.

    BBC News: US opens investigation into Tesla after fatal crash - http://www.bbc.co.uk/news/technology-36680043

    Tesla has not made any statement of "why", only that "neither the Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied". That much is certainly true.

    They haven't mentioned this being due to blind spots or sensor failure or anything else. Don't get me wrong, it's likely that something should be changed in the sensor system and/or software, but at the moment it's a VERY carefully worded statement that avoids making any statement that they do not know to be true.

    What you can be sure of is that they'll spend tens of millions to ensure the chance of this happening again is significantly reduced.
    The Assumptions - UAE party band for all your rock & soul desires
  • stickyfiddle Frets: 27833
    stratosonic said:
    Personally I enjoy driving too much to want it automated to the point of being a passenger.

    One thing I've wondered about is the legal/insurance issue of who pays out if the technology makes a mistake and causes an accident. E.g. many cars now have automated parking: you take your hands off the wheel and let the car park itself. What if it has a software glitch and reverses into the expensive car behind? It wasn't the driver's fault - the car was parking itself.
    I agree, but I also take a lot of cabs, which would be safer with an automated car, and I commute to work, which is not something I need to do myself. And the same is true for millions of people. My mum doesn't need to drive herself to the shops. Your average 20-somethings would rather be on their phone than driving a car in traffic (gross generalisation, obviously - substitute whichever unnecessary stereotype you prefer), and I would rather they had a driverless car, as it would be less likely to drive into me in the same traffic.

    A number of people will always want sports cars, and that won't disappear within our generation, but for the majority of people not having to actively drive would be brilliant.
    The Assumptions - UAE party band for all your rock & soul desires
  • Chalky Frets: 6811
    @stickyfiddle Agreed, but their statement is clear that white truck against brightly lit sky = invisible to the sensors. I wonder if they publicised that as a known error scenario?

    And of course you are right that they will fix it. But as I said before, the law in this area is vague at best and wide open for litigation (rather like the concerns over US drone use law), and the car makers are being foolish to address it through test cases like this.
  • chillidoggy Frets: 17137
    Chalky said:
    It's ok. Elon Musk says "The driver cannot abdicate responsibility". So you sell me a car with functions that I can use but must not trust.

    New paradigm? This kettle can boil water but it must be watched at all times because you cannot abdicate your responsibility and trust it to boil water and turn itself off?


    Precisely why I posted it in the first place.

    One thing does occur to me though: How many lives have been saved by the system when it does work?


  • Chalky Frets: 6811
    octatonic said:
    The responsibility is still with the driver.
    Unless the manufacturer is promoting delegation of that responsibility to the car. The case might hinge on that, especially thinking of cases that result in "contains hot coffee" outcomes ;)

  • Chalky Frets: 6811
    Sporky said:
    So as soon as the sun is bright enough to limit one sensor the car should "brake hard" at 85mph?

    I'm glad you worked in the City, not in writing safety-critical software!
    So let me get this straight Mr Safety Critical Man. On a 55mph road with side roads from which vehicles such as, oh I guess, a large articulated truck can suddenly pull out, you don't think a car doing 85mph should brake hard in case there is an even faster vehicle behind him?

    I'm glad you don't work for Tesla!

    But, on second thoughts it would actually explain a lot... :-? :))
  • hugbot Frets: 1528
    edited July 2016
    Steffo said:
    The guy was watching a movie and apparently going at over 85mph when the trailer crossed the road, with the sun at a particularly bad angle blinding the nav system. Technology can only help so much if the person is hell bent on winning the Darwin Award.

    Part of the potential problem with this kind of tech, though, is that it presents a "worst of both worlds" situation, where the driver is bored and less alert because the car is doing the bulk of the work. But then the driver is suddenly expected to jump in in the event of danger, slowing reaction time massively, or potentially turning a danger the computer could have avoided into a human-controlled crash.

    Watching a video while "driving" is dumb, but it's the kind of user error you can expect with this kind of tech, in a Murphy's law kind of way. I'm pro self-driving cars, but it really needs to be fully autonomous or nothing.
  • Sporky Frets: 29194
    Chalky said:
    So let me get this straight Mr Safety Critical Man.

    OK, Mr "Always brake hard at 85mph" man.


    On a 55mph road with side roads from which vehicles such as, oh I guess, a large articulated truck can suddenly pull out, you don't think a car doing 85mph should brake hard in case there is an even faster vehicle behind him?
    Maybe you should re-read what you originally posted, instead of backing yourself into a corner and then making childish snipes and making up what other people have said.


    Chalky said:
    It can't deal with an utterly predictable and computable event such as where the bloody sun is and therefore brake the car hard because it is now completely blind?!
    What you suggested was that if the car loses data from one sensor it is "completely blind". Not true.

    You then suggested that having lost data from one sensor it should brake hard. No qualification.

    That is patently ridiculous.

    Now you're trying to bend what I've posted into a different scenario - which didn't happen - to try to fit a claim you didn't make.

    Good luck with that!
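    The point that losing one sensor is not the same as being "completely blind" is essentially what sensor fusion is for. A toy sketch of the idea (hypothetical sensor names and weights, not Tesla's actual pipeline):

```python
# Toy sensor-fusion sketch: each sensor reports (detected, confidence),
# and a failed or washed-out sensor simply drops out of the dict. The
# system is only "completely blind" when every sensor has dropped out.

def fuse_obstacle_detected(readings: dict) -> bool:
    """Confidence-weighted vote over whichever sensors are still alive."""
    total = sum(conf for _, conf in readings.values())
    if total == 0:
        return False  # genuinely blind - a separate failure mode entirely
    vote = sum(conf for detected, conf in readings.values() if detected)
    return vote / total > 0.5

# Camera blinded by glare drops out; radar still reports a return:
readings = {"radar": (True, 0.8), "ultrasonic": (False, 0.3)}
```

    Under this sketch the glare-blinded camera degrades the vote but does not blind the system - which is exactly why "one limited sensor means brake hard" doesn't follow.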
    "[Sporky] brings a certain vibe and dignity to the forum."
  • Chalky Frets: 6811
    @Sporky said "What you suggested was that if the car loses data from one sensor it is "completely blind". Not true."

    Er, Tesla said their sensors didn't see the articulated truck.
  • Heartfeltdawn Frets: 22601
    Chalky said:
    @Sporky said "What you suggested was that if the car loses data from one sensor it is "completely blind". Not true."

    Er, Tesla said their sensors didn't see the articulated truck.
    Is there anything you're not an expert on, Chalky?
  • Chalky Frets: 6811
    Heartfeltdawn said:
    Chalky said:
    @Sporky said "What you suggested was that if the car loses data from one sensor it is "completely blind". Not true."

    Er, Tesla said their sensors didn't see the articulated truck.

    Is there anything you're not an expert on, Chalky?

    Don't have to be an expert to read Tesla's statement!
  • crunchman Frets: 11532
    If the car was doing 85mph then that is a problem in itself.  You would expect self driving cars to stick to the speed limit.  A fully autonomous car might have actually been safer here as it would only have been doing the speed limit.  Semi-autonomous might be the worst option as @hugbot suggested.  The driver is still able to get it to 85mph while he isn't as alert as he would normally be at 85mph.

    From what I've read elsewhere, it's at higher speeds that autonomous cars have more problems. The distance ahead that the sensors can see accurately, and process quickly enough, isn't that great.
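    The speed point is easy to put numbers on: braking distance grows with the square of speed, so 85mph needs roughly 2.4 times the sensing-and-stopping distance of 55mph. A back-of-envelope calculation (idealised physics, assumed friction coefficient, no driver reaction time):

```python
# Idealised braking distance: d = v^2 / (2 * mu * g).
# mu = 0.8 is a typical dry-tarmac friction assumption, not a Tesla spec.

def braking_distance_m(speed_mph: float, mu: float = 0.8, g: float = 9.81) -> float:
    v = speed_mph * 0.44704          # mph -> m/s
    return v * v / (2 * mu * g)

# ~38.5m at 55mph vs ~92m at 85mph - the (85/55)^2 ≈ 2.39 ratio.
```

    Whatever range the sensors can usefully process, the car at 85mph eats through it nearly two and a half times faster than one at the speed limit.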