How experiments become futures: Social learning for self-driving cars

@Universität für Bodenkultur Wien (BOKU), 27 June 2019
What is the connection between a tomato harvester and self-driving cars? Dr Jack Stilgoe, Associate Professor at University College London (UCL), discusses a broad range of questions around the regulation of self-driving cars.

Tomato harvesting and self-driving cars

“This is a tomato harvester,” said Dr Jack Stilgoe, associate professor at University College London (UCL), gesturing towards his presentation slides. The slides showed an image of an agricultural machine in action on a tomato plantation. While the audience was still trying to figure out the connection between the tomato harvester and social learning for self-driving cars, Stilgoe continued: “If you’re a juicy, tasty, and soft tomato, you have little chance of surviving the tomato harvester. The tomato harvester favours dense, hard, and absolutely tasteless tomatoes.”

Stilgoe further explained that the emergence of the tomato harvester has had far-reaching consequences for the tomato market and beyond. For example, apart from encouraging the production of low-quality tomatoes, it has significantly reduced the number of migrant farm workers. However, at the time the tomato harvester was first introduced, few understood it as “an artefact with politics”. Therefore, its effects on society have been thoroughly described and discussed only in retrospect. With self-driving cars, we are again about to miss the window for a timely and proper public discourse.

Where do we come from?

To show how self-driving cars have been imagined in recent history, Stilgoe used two examples.

A very early vision of a self-driving car was presented in a film at the General Motors “Motorama” auto show in 1956. The self-driving car in the film only worked on dedicated roads equipped with navigation aids for the car. Very much like the tomato harvester, self-driving cars have shaped their field of operation from early on.

In 2007, DARPA (the US Defense Advanced Research Projects Agency), an agency of the US military, held public demonstrations of self-driving vehicles. At first glance, the vehicles looked pitiful at best. Although driving at low speed, they still repeatedly crashed into each other and into obstacles. At second glance, however, the demonstration of vehicles driving by themselves – however badly they did so – showed that what had once seemed impossible was now inevitable.

Are we nearly there yet?

How far have we come since DARPA’s demonstration more than ten years ago?

A realistic impression of the progress we have made, or failed to make, since “Motorama” in 1956 is offered by the new self-driving bus in the “Seestadt Aspern”, an area in the east of Vienna. Only recently, in June 2019, the “auto.Bus – Seestadt” started its public transportation test service. Its operation is highly restricted: it has a maximum speed of 20 km/h, it operates in a newly-built area without much traffic, dedicated stops have been built and equipped with transmitter units, and a human operator sits on the bus, ready to intervene if necessary.

A much more advanced stage of development, however, is suggested by countless YouTube videos about Tesla’s “Autopilot System”. Many of these are even sped up to give the appearance of higher driving speed. “The videos are mere performances and should be treated as such. They are doomed to succeed,” Stilgoe added. The problem is that such demonstrations make self-driving cars seem more sophisticated and safer than they really are, especially since neither the enthusiastic users nor the manufacturers themselves are keen on emphasising how immature the technology still is.

To illustrate the severe consequences of this overestimation, Stilgoe introduced the audience to Joshua Brown, the first person to die in a self-driving car accident, and to Elaine Herzberg, the first bystander to be killed in a self-driving car accident. In the investigation that followed Joshua Brown’s death, the NTSB (the United States National Transportation Safety Board) criticised Tesla for not doing enough to prevent misuse of the “Autopilot System”.

Did we take a wrong turning?

The short history of self-driving cars has been shaped by innovation without permission and by premature experiments conducted by manufacturers. Self-driving cars were sold as if they were finished products, yet manufacturers did not take responsibility for the failures that occurred, since it was – after all – only “public beta phase” testing.

The “public beta phase” testing and its problems led straight to a lively discussion (lively despite the fact that it was 35 degrees Celsius in Vienna that day), in which the audience and Stilgoe touched upon a broad range of questions around the regulation of self-driving cars. Is there a right moment to expand the laboratory from a fenced-off territory to the public at large? How proactive could policy-makers be and how proactive are they in reality? Do we need a stricter framework for labelling zones of experimentation, so that at least bystanders are informed about the risk that comes with self-driving cars?

Although these questions are complex, coming up with answers and developing solutions is not impossible. What counts is that the questions are asked in time, because – for all we know – we might be the self-driving car equivalent of a juicy, tasty, and soft tomato.

Thomas Buocz, July 2019

The report in PDF format is available here.
