When Google agreed to help the Pentagon with AI for defense purposes, its employees revolted. They did not want their company to be even remotely connected with war.
Around 4,000 employees signed a petition asking Google not to go ahead with Project Maven, a Pentagon program focused on the targeting systems of the military's armed drones. Google was to contribute artificial intelligence technology to the program; essentially, it would supply machine-learning algorithms to help military drones.
Many employees have also resigned. In the petition, the employees said, “We cannot outsource the moral responsibility of our technologies to third parties.” They felt the project put Google's reputation at risk and ran in direct opposition to the company's core values. A resigning Google employee told Gizmodo, “At some point, I could not in good faith recommend joining Google, knowing what I knew. I realized if I can’t recommend people join here, then why am I still here?”
This suggests that the employees have absorbed the company's mission statement completely. While mission statements are supposed to be the guiding lights of companies, companies most often forget their mission and values in the throes of business and the pressure of meeting short-term goals. So it is quite remarkable that Google's employees are proving to be the conscience of the company, reminding it of its stated values and mission. They also said that Google should not be in the business of war.
In part, the greatest worry seems to be whether artificial intelligence can actually differentiate between military and civilian targets. In the past, drones have often hit civilian targets thanks to human error. Is there an equivalent AI error when tasks done by humans are handed over to AI? What comes to mind is the Uber self-driving car that killed a pedestrian earlier this year. While the virtues of delegating human tasks are many, no one is quite sure what mistakes are likely to happen when we employ AI to do them.
The more difficult question is that of responsibility. If a human driving the Uber car had killed someone, that person could be held responsible and the law could take its course by delivering the appropriate punishment. But when AI makes a mistake, there may be no one to punish. This means that mistakes made by AI, as in the Uber case, may simply have to be forgiven by society, since there is no clear legal recourse for AI errors.
In many ways, the stand taken by Google's employees is a first. It is the first time employees are telling their company what is ethical and what is not, rather than the other way around. So it is being seen as a unique case of ethical activism from employees. And the general public seems to be supporting them:
A dozen Google employees resigned in protest over its involvement in military contracts using machine intelligence to control drones. This is some sterling ethical activism and should be getting more press: https://t.co/7UMohMzDtg
— Laurie Voss (@seldo) May 14, 2018
Those who think that ethical approaches to business are a "nice-to-have" may be surprised by the resignations at #Google over #ProjectMaven. Smart companies will use #ethics as a competitive advantage, to win the best talent. But t…https://t.co/E7IgTi7fy2 https://t.co/OJfFqYbjwb
— Tom Upchurch (@t_upchurch) May 15, 2018
The employees have also received support from the Electronic Frontier Foundation (EFF) and the International Committee for Robot Arms Control (ICRAC).
It is a case of employees trying to impose their will on the will of the company. But how will it turn out? A long time ago, one of my bosses told me that the corporate ego is bigger than personal egos. If that is true, Google will win this battle. There is also a nationalistic element to the project: the counter-argument is, should a company deny its country the support it requires?
Project Maven has sparked a new form of employee activism at Google, and it will be interesting to watch how this pans out for the company.
Connect with me on Twitter.