That's pretty usual for them; it'll probably need a board-level repair and cost at least $250, or more if you go through Apple.
When will teens ever stop texting and driving?! I have a G1 license, and I drive, but do I text and drive? NO. As long as teens keep doing shit like this, we'll keep having accidents like this.
Whenever I drive, I do not use my phone. Hopefully these new laws that the Government enacted recently will do something about people using phones and other electronics while driving.
I know people do some weird shit when in shock, but the way she acted was disgusting. Shaking her dead sister's head around and a general apathetic attitude...
That's why I have a plan (not that I can do anything about it, though): we'd have two driver's licenses for the common people, a self-driving license and a normal license. You could apply for a self-driving car license and do some tests, or you could do the tests we do now and get the license we have now. That way we can weed out the morons.
IKR? Makes it even more pathetic. It's one thing to drive while under the influence of electronics, but what she did is on a whole new level.
Back in the dayz... It's crazy to think we got to a point where some people text while driving, while other people even go as far as playing Pokémon Go while driving.
AI can't do everything, so some things, like decision making, should stay with humans. The paradox is: what if there are only 2 options, kill a person or kill a pet? The AI can't choose, but the human can. Other things will be more efficient even with an idiot at the wheel.
No shit, but eventually we'll have fully self-driving cars. If they have manual controls, why would you need a separate license for those?
It can easily make those types of decisions, actually. It's not like it'll just spaz out and crash into a lamp post. Car companies looking into self-driving tech have done surveys on this very issue, asking what choices people would make, so that they can program the A.I. to react the same way. No company in their right mind would ever put a car on sale that wasn't capable of such basic decision making. I'm not sure where everyone is getting this misconception that robots can't handle moral dilemmas. All a programmer has to do is give it a hierarchy, for example dog < person < baby < you, and it will aim for whatever is lowest. Robots are programmed by humans, remember; they're not mindless machines set free on the roads with a black-and-white list of road rules, any slight deviation from which will cause a system shutdown.
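To illustrate that hierarchy idea, here's a minimal Python sketch. Everything here (the category names, the priority values, the function name) is made up for the example, not anything a real car company actually ships:

```python
# Hypothetical sketch of the hierarchy idea: assign each obstacle class a
# priority, and when a collision is truly unavoidable, pick the option
# lowest in the hierarchy. All names/values are invented for illustration.

# Lower number = lower in the hierarchy (dog < person < baby < occupant).
PRIORITY = {
    "dog": 0,
    "person": 1,
    "baby": 2,
    "occupant": 3,
}

def choose_collision_target(options):
    """Given the unavoidable options, return the one lowest in the hierarchy."""
    return min(options, key=lambda obstacle: PRIORITY[obstacle])

# Forced choice between a person and a pet: the sketch picks the pet.
print(choose_collision_target(["person", "dog"]))  # -> dog
```

Obviously a real system would be far more involved (probabilities, uncertainty about what it's even looking at), but the point stands: a ranked preference list is something software can follow trivially.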