Critical systems once operated by humans are becoming increasingly dependent on code and the developers who write it. Machines and automation bring many benefits, such as increased productivity, quality, and predictability.
But when websites crash, 911 systems go down, or radiation-therapy machines kill patients because of a software error, it’s vital that we rethink our relationship with code, as well as the moral obligations of machines and humans.
Should developers who create software that impacts humans be required to complete ‘do no harm’ ethics training? Should we begin measuring developers not only by the functionality they create, but also by the security and moral frameworks they provide?
Other articles discussed:
Panelists: Cindy Ng, Kilian Englert, Kris Keyser, Mike Buckbee