
Self-driving cars don’t need drivers but they do need humans


There’s a clever twist in the self-driving car hype. First we hear self-driving cars will be safer than regular cars because humans are just too fallible. Then we hear we are crazy for thinking these vehicles will be safe without human oversight. As a result, it seems self-driving cars don’t need drivers, but they do need humans.

Promises, promises

Look no further than Elon Musk for inflated promises about self-driving cars, also known as autonomous vehicles (AVs). He plans to convert Tesla’s electric cars into “fully” self-driving robotic taxis in April or June next year!

Meanwhile, 60 companies are scrambling to build an AV that is safer than a human driver, and many experts say this is still at least 10 years away. Waymo has already said a fully autonomous (level 5) vehicle will never happen, and HERE Technologies says it could be 30 years away.

Now Dr Maarten Sierhuis, Nissan’s chief technology director, says level 5 AVs may never happen, and certainly not in the way everyone claims:

“Why do people think we can have millions of these vehicles just driving around and not needing any human interaction?… Show me an autonomous system without a human in the loop and I will show you a useless system.” Seems like we do need humans after all.

Humans in the loop

The Nissan idea of “humans in the loop” is similar to the way air traffic control works. Even though planes have two pilots and a computer system, air traffic controllers remain responsible for every plane within their geographic area.

Sierhuis says we can apply this idea to self-driving cars. A remote human controller would take responsibility for a fleet and make instinctive human decisions when the vehicles cannot. In this view (see the sketch after this list):

  • The system is always autonomous
  • The car is responsible for safe driving
  • The human in the loop supervises the car and makes decisions when it needs human intelligence.
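
If you like to see ideas in code, here is a minimal Python sketch of that split of responsibilities. All of the names (RemoteSupervisor, AutonomousVehicle, the confidence threshold) are made up for illustration and are not Nissan’s actual system.

    # A rough sketch of the "human in the loop" model described above.
    # Everything here is hypothetical and illustrative, not Nissan's real design.

    from dataclasses import dataclass

    @dataclass
    class Situation:
        description: str
        confidence: float  # the car's own confidence it can handle this, 0.0 to 1.0

    class RemoteSupervisor:
        """A human controller responsible for a whole fleet, not a single car."""
        def decide(self, vehicle_id: str, situation: Situation) -> str:
            # In reality a person would review camera and sensor feeds here.
            print(f"Supervisor reviewing {vehicle_id}: {situation.description}")
            return "proceed slowly around the obstacle"

    class AutonomousVehicle:
        """The car stays responsible for safe driving at all times."""
        CONFIDENCE_THRESHOLD = 0.8

        def __init__(self, vehicle_id: str, supervisor: RemoteSupervisor):
            self.vehicle_id = vehicle_id
            self.supervisor = supervisor

        def handle(self, situation: Situation) -> str:
            if situation.confidence >= self.CONFIDENCE_THRESHOLD:
                return "handled autonomously"
            # Escalate only the decisions that need human intelligence.
            return self.supervisor.decide(self.vehicle_id, situation)

    # Example: a fallen tree blocking a lane is unusual enough to escalate.
    car = AutonomousVehicle("robotaxi-42", RemoteSupervisor())
    print(car.handle(Situation("fallen tree blocking the left lane", confidence=0.4)))

The point is the shape of it: the car does the driving, and only the rare, genuinely ambiguous decisions reach a human who looks after many vehicles at once.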

Of course, under this system individuals do not own the vehicles. Manufacturers like Nissan want to use them commercially as robotaxis or for ride-hailing services.

Accept their limitations

Professor Regan of the Australian Road Research Board says the success of AVs will depend hugely on people knowing the limitations of these systems at each level of autonomy. Already we know many drivers are too trusting of advanced driver assistance technologies, which are only level 2. Level 5 is full autonomy.

So if you previously thought humans were the ones with limitations, you now need to accept that autonomous technologies have their limitations too.

Ford said recently it had overestimated how soon AVs would arrive because of their immense complexity. Its 2021 model will be “geo-fenced”. A geo-fence is a virtual perimeter created around a real geographical area using GPS, and the vehicle will not be capable of driving outside it. Regan also says the first self-driving cars will function reliably only under certain conditions, such as on the freeway.
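
For the curious, here is a minimal Python sketch of how a geo-fence check could work, assuming the service area is treated as a simple circle around a centre point. The function names and coordinates are made up for illustration; production systems match GPS positions against detailed mapped boundaries rather than a circle.

    # A minimal sketch of a circular geo-fence check, assuming the service area
    # is approximated by a centre point and a radius. The coordinates below are
    # illustrative only.

    from math import radians, sin, cos, asin, sqrt

    EARTH_RADIUS_KM = 6371.0

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two GPS points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

    def inside_geofence(vehicle_lat, vehicle_lon, centre_lat, centre_lon, radius_km):
        """True if the vehicle's GPS position is inside the virtual perimeter."""
        return haversine_km(vehicle_lat, vehicle_lon, centre_lat, centre_lon) <= radius_km

    # Example: a 25 km geo-fence around Sydney's CBD (numbers for illustration only).
    print(inside_geofence(-33.87, 151.21, -33.8688, 151.2093, radius_km=25))   # True
    print(inside_geofence(-32.93, 151.78, -33.8688, 151.2093, radius_km=25))   # False (Newcastle)

A car that drives off towards Newcastle would simply be outside its permitted area, and the system would refuse to operate autonomously there.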

Humans are the best

These are big limitations compared to human capabilities. Meanwhile, AVs will need about a billion lines of code, more than a thousand times as much as Apollo needed for a moon landing.

Imagine if only one line of it goes wrong. You could end up on the moon.

Dr Sierhuis of Nissan says, “[Humans] are the best autonomous systems I know of.” But is anybody listening to him?


Corrina Baird

Writer and Researcher

Corrina used to lend her car to her kids and discovered what Ls, Ps and demerits mean for greenslips. After 20 years in financial services and over 9 years with greenslips.com.au, she’s an expert in the NSW CTP scheme. Read more about Corrina
