What are some potential ethical and moral consequences (positive and negative) that could arise in the next few years as AV tech becomes more predominant?

I think we all know the infamous ethical dilemma, a version of the Trolley Problem, in which an autonomous vehicle must choose between harming its occupants or some hapless pedestrians.

Can you think of any others? (Hint: there are multitudes!) And what about solutions? Do we fall back on Isaac Asimov’s Three Laws of Robotics and just hope for the best?


I think in the autonomous economy we should distinguish between law and morality.
Law can be encoded on a blockchain as smart contracts or similar mechanisms, but moral questions should not be settled in code.
Otherwise, whoever writes that code will rightly face public criticism.


Increasingly I think of cars as "externalizing machines", which explains much of their recent development: ever larger, more aggressive, more imposing. It's about optimizing extraction from the commons: (1/n)

Space: bigger car, more personal space (big upside), less road space around the car (a diffuse, small downside to many others), low marginal cost (weak or no price signal in rego in Australia, the USA, etc.)

Safety: bigger car, more mass, more inertia. Protects occupants (big upside). Higher consequences for other collision participants (an externalized and diffuse downside).
Note: this is also incentivized by ANCAP and similar schemes, which rate only occupant safety.

Noise: ever louder horns, increased soundproofing. Personal upside: a loud horn is more threatening to outside parties, and there is less noise inside the car. Diffuse externalization: more urban noise, and louder emergency sirens needed to overcome the soundproofing of cars.

Every individual can be forgiven for wanting the most value for their dollar. But these external costs add up. They feed a "cold war" escalation that erodes urban safety and amenity. There are only two levers with which to respond: regulation and price. Neither is used well.