Motor vehicles of all shapes and sizes are being developed to function autonomously, and autonomous vehicles (AVs) are in the news every day. Vehicle manufacturers, giants in the computer industry, government agencies, and consumer-safety advocates alike have been touting the advent of AVs and their potential to reduce motor vehicle accidents, injuries, and deaths. Lingering like a dark storm cloud, however, are questions about how these machines should be regulated to assure the public that they are safe and foolproof. Because AVs are programmed to behave and react to the circumstances and situations their "creators" foresee, we are only now beginning to ask how AVs should be programmed to respond to emergencies they were not designed to anticipate. Answering that question requires programmers, manufacturers, government agencies, and consumer-safety advocates to define the choices these machines will be required to make when emergencies arise. As just one example, consider how AVs should be programmed to respond to this emergency situation ("The social dilemma of autonomous vehicles," Bonnefon et al., Science, Vol. 352, Issue 6293, pp. 1573-1576 (2016)):

In 1942, science fiction author Isaac Asimov wrote a short story called "Runaround," in which he postulated that when robotic machines populated the world a hundred years hence, they would have to be programmed with "the Three Laws":