Car Questions


Self Driving Cars

  

0
Topic starter

Not a question but saw this story on Yahoo and thought I'd throw it out for comment, as we discussed this about a week ago.  Cheers!

 


13 Answers
2

I have a "self-driving car."

I'm the "self!"


Great point.


2

I left a similar comment on another post about this autopilot technology, so I'll summarize it: any human-made software is prone to mistakes and errors, no matter how many years it has been in use or how much it has been refined. Software can't think the way humans can when a new situation arises, so there's no way in the world I would trust a self-driving car unless I am the "self." Even the autopilot system on airplanes, which has been around so long, doesn't mean the pilot can set it and then sleep; they still have to stay aware and be ready to take action if needed, and that's in the sky, with no pedestrians or a ton of other cars. On the ground, with cars and people everywhere, a totally autonomous car is a recipe for disaster.


I agree, but I'd like to add that on my car, the system keeps itself in the lane, corrects the speed, and keeps its distance; sometimes it still attempts to do very stupid things. It's a great tool, and on the highway I have no doubt that it does the job better than I do.
The main thing drivers have to learn is when NOT to use the system; the key skill they will have to learn is recognizing when it's time to disable it and take over in manual mode.


In the hands of a good driver, it's mostly useless. In the hands of a poor driver, it's a crutch that will only make them lazier, but at least it will protect me a bit. I expect it will lead to a general drop in driving skills out there. Let's hope all the most dangerous drivers get into self-driving cars as quickly as possible.


As for the dangerous drivers, I wouldn't trust them in self-driving cars either, because they probably wouldn't even know when to take over and use manual mode.
As for options like lane keeping and distance keeping, some newer conventional cars (ones that are not self-driving) already offer them, and I prefer driving myself with these technologies helping me rather than relying totally on the car.


Also, one other point: what if the software does not respond when the person wants to take over from self-driving mode? I'm pretty sure you've all heard about what happened with the Boeing 737 MAX; I see the nosedive issue on those planes, which the pilots couldn't override, as analogous to drivers being unable to take over from a car's self-driving system. That would be an even bigger recipe for disaster. Based on my experience in the tech field, I just won't trust any human-created software a hundred percent, because it's entirely possible for it to malfunction.


Based on videos I've watched on YouTube, and drivers I've seen out there on the road, I'm convinced that self-driving cars are already better than some of the worst human drivers.


I live in a big, crowded city, so yes, I see crazy stuff while driving that makes me think many people shouldn't even be given driver's licenses. But unless we're talking about those truly horrible, dangerous drivers, I still wouldn't trust fully self-driving cars. Options that make the car safer and take control of some things when the driver doesn't react in time are good, but it should never be advertised as fully self-driving, and it shouldn't be fully self-driving either.

Just recently, Tesla recalled cars because they were programmed to roll through stop signs instead of coming to a full stop. They couldn't properly program even something as simple as that. I'm not sure I would trust Tesla with this whole self-driving thing.


@MMJ I disagree on it being useless; it's most useful in the hands of a good driver.
An inexperienced driver may try to use it where he shouldn't; an experienced driver feels its limitations.
To me, having the car handle speed, distance, and lane keeping is life-changing.
(That said, I live where the roads are the most congested in the developed world, twice as bad as Spain, which is the second most congested... If the roads where you live aren't always 900% beyond capacity, aren't twisty, and don't have very low speed limits that are hard to obey under heavy speed enforcement, then maybe it's not insanely useful...)


1

This raises a question.

Is it possible for someone to start the car, leave it running and go inside for a minute, and the car takes off and drives itself?

Wouldn't that be a trip?

Next thing you know, the Terminator will come back in time to kill Tesla! 😱


I thought I heard someone say they had a "summon" mode, but I'm not sure.


I think you meant to say "Warrant."


No, as in you summon the car and it drives itself to your location.


Got it, I was talking about the warrant for your arrest.


There's probably an app available for an additional fee.


1

It would be interesting to hear the defense and know if there's actually a conviction. It's a good bet the defense attorneys will blame Tesla if it goes to trial.


1

@mmj @doc Watch from 03:10 -

https://youtu.be/0ssucYoYtYE


Thanks ITWT, I appreciate it.


"What could possibly go wrong?"


What if you don't press the "brake" app on your phone? Does it just drive into a crowd of children?


If it's anything like my robot vacuum, it'll spend a couple of hours searching for its charging station 😆


Great stuff, I sincerely thank you for the comic relief, MMJoe.


1

I agree with everyone who said it's better that a human is in control of the car. IMO, autopilot makes a bit more sense on highways than in stop-and-go traffic, because the system can make more mistakes when there's so much going on. Honestly, I think full autopilot is asinine.


1

Unfortunately, you can't fix "stupid"! The other day on the thruway I was next to a driver in a newer Highlander with the "lane warning assist" lit up in her driver's mirror, and she proceeded to push me right out of the lane; I leaned on my horn profusely! I have driven cars with this feature, and how you can ignore such a great warning is beyond me. Self-driving cars will just raise the risk of accidents as well. Nothing is perfect, especially software and sensors designed and developed by humans.


0

I have no pity for the driver. The problems Tesla has with the system are hardly unknown.


0

what's to discuss?

Isn't that how it should be, Tesla or not?

(the jail time I mean)


0
Posted by: @fjcruiser2014

I see the nosedive issue on those planes, which the pilots couldn't override, as analogous to drivers being unable to take over from a car's self-driving system.

Well, you can't counteract something you don't know exists and which the manufacturer hasn't mentioned in the operating manuals.

So if the OEMs don’t explain the workings (and limitations) of the self driving system, then yes, chaos may ensue.


Actually, on the flight that crashed in Indonesia, exactly the same plane encountered the same issue one flight before the crash, and that pilot was able to override it, take manual control, and prevent a crash. So it wasn't impossible to override. The real issue is that the system relied on readings from a single sensor to kick in, which definitely isn't a good idea; if self-driving cars use the same approach, that's another piece of a possible disaster. Also, the pilot who managed to override it had much more experience, so drivers of self-driving cars would likewise need some level of experience to override the system. And a third point: the MCAS system would keep kicking in even after being overridden, which is a glitch in the software. If self-driving car software does the same thing, that would be a troublesome glitch.


MCAS was actually working as designed. It was the erroneous data from the AoA indicator that fed false information to the aircraft's Air Data Computers. I don't think you fully understand the sequence of events of the accident you're referring to. But I doubt self-driving cars will ever be as complex as a jet transport aircraft.


I dunno, there's a lot more going on on the ground (traffic, pedestrians, drivers running red lights, road construction, deer, etc.) than there is in the air.


I don't see drivers being required to hold individual vehicle type ratings with stringent licence renewals every 12 months, plus training at each renewal for all the failures you mentioned. I've seen an aircraft autopilot put my airplane into an unusual attitude out of the blue. If it had been a pilotless craft, that airplane would have flown into the ground nose first, with the loss of everyone on board and others on the ground.


That sounds so boring. All I wanna know is, will the car notify my followers with a twitter status update and insta selfie when I start my twitch live stream. hashtag driverless!


Actually, I know the sequence of events leading to those crashes very well. And if you worked in the software field, you would realize how complex these self-driving systems already are, and they will only get more complex. Yes, MCAS worked as designed by pushing the nose of the plane down, but there were still pilots who managed to override it when it activated on wrong readings from the angle-of-attack sensor. As I said, the first design mistake was that it would kick in based on a reading from even a single sensor (now it will rely on readings from two sensors). Also, when a car is this heavily software-based, the driver needs to know how to work with that software and have adequate training for it. That isn't necessary today, because all cars work pretty much the same way and anyone can drive any car; but once that much software comes into play, drivers will need to know exactly how to work with it. Otherwise I wouldn't feel safe on roads full of people who don't know how to drive, or rather, how to operate, a driving piece of software. A self-driving car whose driver doesn't know how to take over from a software mistake can easily cause a crash and disaster too.
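To illustrate the single-sensor point above: a standard mitigation is to require agreement between redundant sensors before an automated intervention is allowed to fire. Here's a minimal sketch of that idea; the function names, limit, and tolerance values are made up for illustration and are not any manufacturer's actual code:

```python
def sensors_agree(reading_a: float, reading_b: float, tolerance: float) -> bool:
    """Return True only if two redundant sensors roughly agree."""
    return abs(reading_a - reading_b) <= tolerance

def should_intervene(aoa_a: float, aoa_b: float,
                     limit: float = 15.0, tolerance: float = 2.0) -> bool:
    """Trigger the automated intervention only when BOTH redundant
    angle-of-attack readings agree AND both exceed the limit.
    A single faulty sensor can no longer trigger it on its own."""
    if not sensors_agree(aoa_a, aoa_b, tolerance):
        # Disagreement: distrust the data and alert the pilot/driver
        # instead of acting automatically.
        return False
    return aoa_a > limit and aoa_b > limit

# One faulty sensor reading 25 while the other reads 5: no intervention.
print(should_intervene(25.0, 5.0))   # False
# Both sensors agree the limit is exceeded: intervene.
print(should_intervene(20.0, 21.0))  # True
```

The trade-off, of course, is that cross-checking two sensors means a genuinely dangerous condition seen by only one working sensor also gets ignored, which is why disagreement should raise an alert rather than fail silently.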


And to @mountainmanjoe's point, which I agree with, that there is a lot more going on on the ground: it means much more complex systems are needed if the car is to be fully self-driving. So much more can happen on the ground, and relying on software to decide how to react in all those circumstances is scary. No matter how much training is built into the software, there is always something unexpected and new that can happen while driving. And the more software there is, the more can go wrong, and the more the driver needs to know about overriding it, especially if the software is built in a way that constantly tries to take back control.


0

A new article about another issue with Tesla and its Autopilot: phantom braking, triggered by the car's sensors misperceiving harmless objects, even something like a plastic bag.

https://www.seattletimes.com/business/tesla-drivers-report-a-surge-in-phantom-braking/
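One common way systems reduce spurious triggers like this is to require a detection to persist across several consecutive sensor frames before acting on it. A purely illustrative sketch of that idea follows; the class name and frame count are my own assumptions, not Tesla's implementation:

```python
from collections import deque

class PersistenceFilter:
    """Confirm an obstacle only if it is detected in N consecutive
    frames, suppressing one-frame glitches (e.g. a sensor momentarily
    flagging a drifting plastic bag as a solid obstacle)."""

    def __init__(self, frames_required: int = 3):
        self.frames_required = frames_required
        self.history = deque(maxlen=frames_required)

    def update(self, detected: bool) -> bool:
        """Feed one frame's detection result; return True only when the
        last `frames_required` frames all reported a detection."""
        self.history.append(detected)
        return (len(self.history) == self.frames_required
                and all(self.history))

f = PersistenceFilter(frames_required=3)
print(f.update(True))   # False (only 1 frame so far)
print(f.update(False))  # False (glitch cleared)
print(f.update(True))   # False
print(f.update(True))   # False (last 3 frames: False, True, True)
print(f.update(True))   # True  (3 consecutive detections)
```

The catch is latency: every frame of required persistence delays the reaction to a real obstacle, so this is a tuning trade-off between false brakes and braking distance, not a free fix.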

 



0

It wasn't me, the car did it.

 

The entire idea is ludicrous, a gimmick.  Why do you need the car to drive itself when you're sitting right there?


0

Lawyers will have a field day!!! 

