
I just finished watching I, Robot and I can't understand how a truck can crash.

From what I can see, cars (and trucks) have the ability for autonomous driving, i.e., the driver does not need to control the vehicle. So how does a truck driver who is working a double shift fall asleep at the wheel when the truck can drive itself at the flick of a switch? Not to mention the fact that driving manually seems to be frowned upon, as shown by Detective Spooner's captain saying as much when he did it.

So how is it that a truck can crash when autonomous driving was the norm?

Especially in a job like trucking that requires many hours behind the wheel, that's a feature the driver would most certainly take advantage of. Also remember that Spooner did not class the driver as a bad person.

SQB
KyloRen

2 Answers


The simplest answer is that the sequence takes place several years (a decade?) earlier, in a period when auto-driving was presumably less common; otherwise, why would they need a truck driver at all?

Note that Sarah's father also appears to have been driving his car himself.


Valorum
  • Isn't it explicitly stated that the driver fell asleep? – Gallifreyan Jul 23 '17 at 14:12
  • I misunderstood the premise of the question a bit :D – Gallifreyan Jul 23 '17 at 14:18
  • @Gallifreyan If the driver fell asleep and the truck were self-driving, it would have detected that and taken control of the vehicle. If any of the 3 vehicles had automotive radar, the radar would have detected that a crash was likely, and the smart-car's computer brain would have taken over from the driver and avoided the crash. In real life, smart-cars and automotive radar are becoming more common, so this crash should never have happened in a fictional world where these technologies are ubiquitous. Maybe the screenwriters didn't know of, or think through, the implications of a world where all cars are smart. – RichS Jul 23 '17 at 15:05
  • @RichS - No-one in this flashback has an auto-driving car and the truck is being driven by a human. That would strongly imply that auto-drive isn't a thing yet. – Valorum Jul 23 '17 at 15:08
  • @PatrickTrentin If I could upvote you thirty trillion nine hundred and seventy-eight times for this comment, I would. People today treat the concept of the self-driving car the same way people of the '80s treated home computers. The self-driving car is not some automotive panacea through which all vehicular ailments shall be cured. – Magikarp Master Jul 23 '17 at 15:30
  • You are making a big assumption that somehow automation would prevent this type of accident. No matter how much automation there is, there are still real-world things like weather, road surface, vehicle weight, equipment condition, missed sensor readings and much more that could still cause accidents. – Matthew Whited Jul 24 '17 at 13:59
  • @MatthewWhited - We see auto-driving vehicles operating safely at very high speeds and under a variety of conditions. – Valorum Jul 24 '17 at 14:01
  • None of those conditions involve our mid-western winters (USA). The vast majority of self-driving cars on the road today can only handle clear skies. Even those with 3D LIDAR have a hell of a time trying to figure out what to do in the rain (let alone snow and ice). – Matthew Whited Jul 24 '17 at 14:34
  • Hell, in one of the Tesla Autopilot accidents the car was blinded by the sun and drove straight into the side of a semi. – Matthew Whited Jul 24 '17 at 14:35
  • https://www.theguardian.com/technology/2016/jul/05/tesla-crash-self-driving-car-software-flaws – Matthew Whited Jul 24 '17 at 14:36
  • @PatrickTrentin - You can't ever compensate for oil on the roads, regardless of how good a driver your car is and how far ahead the sensors can see. – Valorum Jul 24 '17 at 21:21

The 3 laws: "A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law."

Just like the robot that pulled the officer out of the car instead of the girl: if we assume that the truck had an AI and that it activated when it detected the sleeping driver, it would have found itself in a situation where a wreck was imminent. Given that, it would have acted by the numbers to minimize harm to humans in general (assuming it doesn't have relativist ethics that up-weight harm to "its" human over others). We can easily imagine that the only alternative to the action it took was to swerve into oncoming traffic and cause a much more energetic impact, one correspondingly more likely to cause far more harm to humans. In this way the AI would have chosen the path of least harm while still causing/allowing harm to humans (analogous to, and possibly foreshadowing, VIKI's acts, if we accept the stated givens). A sketch of that selection logic follows below.
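To make the "by the numbers" choice concrete, here is a minimal sketch in Python of how such a controller might rank its options. Everything in it - the Maneuver type, the choose_maneuver helper, and the harm estimates - is a hypothetical illustration of the reasoning above, not anything specified in the film or in Asimov's text.

    # A Three-Laws-style "least harm" chooser (hypothetical illustration).
    from dataclasses import dataclass

    @dataclass
    class Maneuver:
        name: str
        expected_harm: float  # estimated harm to humans (First Law); 0.0 = none
        self_damage: float    # damage to the robot/vehicle itself (Third Law)

    def choose_maneuver(options):
        """Pick the maneuver with the least expected human harm.

        Self-preservation only breaks ties: under the Third Law the robot
        may not protect itself at the cost of extra harm to humans.
        """
        return min(options, key=lambda m: (m.expected_harm, m.self_damage))

    # The situation the answer imagines: a wreck is imminent and every
    # available option harms someone, so the controller causes/allows the
    # least harm it can. The numbers are invented for illustration.
    options = [
        Maneuver("brake hard and hit the car ahead", expected_harm=2.0, self_damage=0.4),
        Maneuver("swerve into oncoming traffic", expected_harm=5.0, self_damage=0.9),
    ]
    print(choose_maneuver(options).name)  # -> brake hard and hit the car ahead

The tie-breaking order is the answer's parenthetical in code form: a controller without relativist ethics weighs all humans equally and only then considers itself, which is exactly the utilitarian arithmetic that VIKI later scales up.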

user85779
  • An interesting theory, albeit one not backed up by any evidence. – Valorum Jul 23 '17 at 17:10
  • Thank you. Yes, it is an "if this is the case (AI driving), how could a crash occur?" exploration, which seemed to be what the questioner wanted. Their ask seemed less about how things happened in the movie (as your circumstantial case makes clear - no AI being available, while not explicitly stated, seems the most supported hypothesis) and more about how an AI could crash/cause a crash (the movie reference is just an example). @Valorum – user85779 Jul 23 '17 at 17:21
  • If it was a general "how could x happen if y" question about robotic ethics, then it would have been closed as off-topic. The OP is clearly after an in-universe explanation, ideally one supported by evidence. – Valorum Jul 23 '17 at 17:48
  • "So how is it that a truck can crash when autonomous driving was the norm?" The question heavily emphasised by the OP IS exactly a "how could x happen if y" question. So I guess your question is, "How can a general question about robotic ethics not be closed as off-topic?" – Luke Jul 24 '17 at 04:37
  • @Luke - Auto-driving is the norm at the start of the film, set several years after the accident. The logical corollary is that auto-driving didn't become generally used until later. – Valorum Jul 24 '17 at 12:49
  • @Valorum in a movie in which the primary conflict driver IS robotic ethics, and a question that specifically states "How is it that x when y", where x = "a truck can crash" and y = "autonomous driving is the norm". Your assertion that it is "clear" the OP was asking something different from what they expressly asked might be right. However, the answer given above, and the explanation for why they answered as they did, is much more founded in the OP's question. – Luke Jul 24 '17 at 22:59
  • @Luke - I disagree. None of the drivers we see are auto-driving and the mere existence of a professional driver (replaced in later years by the automated trucks we see delivering robots) would lend strong credence to the suggestion that auto-driving isn't a common thing at that point in history. The three laws simply don't apply when everyone's driving their own cars and trucks. – Valorum Jul 24 '17 at 23:47
  • @Valorum then your issue should be with the question, not the answer, as I previously pointed out. Regarding your point about "when everyone's driving their own cars and trucks": aircraft have systems to fly using automation, yet we have professional pilots. Robots are involved in many important functions throughout manufacturing, all under the supervision and direct control of a human. Having someone in the driver's seat directly manipulating the controls is not a reliable way to confirm/deny the use of automated driving systems. – Luke Jul 25 '17 at 00:36