The University of Oxford’s driverless car guru Paul Newman wouldn’t trust his kids in an autonomous vehicle just yet
The University of Oxford’s driverless car guru Paul Newman has admitted that he wouldn’t trust his kids in an autonomous vehicle yet, but remains bullish about a near-future in which we wonder why we ever drove ourselves.
Newman, speaking at the recording of ‘The Engineers: The Rise of the Robots’, a BBC Worldwide production hosted at London’s Science Museum, is Professor of Information Engineering and heads the Oxford Robotics Institute at the University.
Asked by an audience member whether he would trust his children to a current driverless car, he answered in the negative.
“No… not yet,” he said, but explained that the whole field is coming on in leaps and bounds and that we are close to a world where driving is as rare a skill as horse-riding.
“I think eventually we will have vehicles that can drive themselves and the expectation is that you won’t want to drive,” he said.
“I think it might become a skill like horse-riding, where it’s rare to have that in some cities, but that’s a long time away.
“It’s going to be a long time until you can walk into a car dealership and get yourself a car that doesn’t have a steering wheel or accelerators and that has the same functionality as your car has today in terms of getting from anywhere to anywhere else at any time of day in any weather.”
Newman also tackled the complicated subject of programming moral choices into a car faced with an impossible situation, although he stopped short of providing a definitive answer.
“Sometimes something has to get hit – that’s just physics and it might not be any fault of the machine. Two people may walk out in front of the machine at the same time.
“If you could say with 100% certainty that it could choose between hitting a two-year-old child or a ninety-year-old, would you be morally right to make it something other than a 50% chance?
“I think I’ve led you to the place where you can believe that it could be computed. The question that’s left is: ‘do you want it to be computed?’
“If you want to compute it, I believe we can, and that’s where the question lies. So I think we have a moral imperative to compute that choice.
“The majority of people think that it should be a legal requirement for the car to act for the greater good.”