Protest against military use of AI. More than 10 Google employees resign collectively, more than 4,000 signatures gathered


About a dozen Google employees submitted their resignations on May 14 in protest of the company's contract with the Pentagon on Project Maven, which supplies AI technology for the operation of military drones.

Project Maven aims to speed up analysis by automatically classifying people and objects captured in military drone footage. It has been three months since Google's plan to provide its own AI for the project came to light, and a signature campaign protesting military use has been under way inside the company ever since.

The resigning employees shared their reasons within the company: ethical concerns about using AI for drone warfare, wariness of the company's political moves, and fears of losing credibility. Gizmodo was able to confirm the content of those discussions through multiple sources.

Several of the resigning employees who agreed to be interviewed complained that management had become less transparent about the matter internally and less responsive to employee protests. Project Maven applies AI technology to classifying images taken by drones, but some employees felt that decisions so closely tied to human life should be made by people rather than algorithms, and that Google should not be getting involved in military work in the first place.

Since its founding, Google has prided itself on an open corporate culture where employees can speak freely, for and against, even when the direction of development is being decided. But current top management no longer listens, and employees who raise objections feel they have no choice but to leave. "I just kept getting more and more disappointed," one said of the feelings that led to the resignation.

I couldn't even bring myself to recommend that people join the company anymore. So why was I still working there? That's what I kept asking myself.

When Google banned sexual content on Blogger in 2015, protests both inside and outside the company led it to relent and withdraw the ban, but this is the first time employees have resigned en masse in protest. It shows how deep the sense of crisis among employees runs.

A petition calling for the immediate termination of the Project Maven contract and the adoption of a new policy ruling out military involvement ended up gathering nearly 4,000 signatures from Google employees.

However, the company's policy has shown no noticeable change; it simply repeats the line that "Maven is not military cooperation." Google is also reportedly willing to bid on the Joint Enterprise Defense Infrastructure (JEDI), another massive Pentagon project, now out for bid, to build a highly classified cloud system intended to strengthen U.S. defenses against Russia and China.

The software Google contributes to Project Maven is open source, so even if Google refused payment or withheld technical support, the military would remain free to use it. That makes the protest harder to frame, but those who quit maintain that actively participating in Maven is flatly incompatible with Google's motto of "don't be evil" and cannot be overlooked. One of them put it this way:

If you were a small machine learning startup that has to find clients wherever it can, that would be one thing, but Google is different. It would be wise to steer clear of this kind of danger.

At the end of February, when word of Google's involvement in Maven spread through the company, it caused an uproar. The company promised to revise its ethics policy within a few weeks, which rang hollow to some; as one employee put it, "Isn't establishing the ethics policy first, and signing the contract after, the proper order?"


Although Google insists that its "homegrown AI will not be used to kill," the use of AI in the Pentagon's drone program is viewed with deep concern by researchers and developers in the machine learning field, and a number of thorny ethical and moral questions remain unresolved.

In April, the Tech Workers Coalition also launched a petition of its own. "Our industry and our technologies have problems of harmful bias, large-scale breaches of trust, and a lack of ethical safeguards that can no longer be ignored. These are life-and-death stakes," the petition reads. It calls on Google to end its involvement with Maven and asks major technology companies, including IBM and Amazon, to refuse to do business with the US Department of Defense.

In addition, more than 90 academics in artificial intelligence, ethics, and computer science recently published an open letter asking Google to end its cooperation with Project Maven. The letter also calls for an international treaty prohibiting autonomous weapons, and its authors include Peter Asaro and Lucy Suchman, who have warned the United Nations about the dangers of autonomous weapons, and Professor Lilly Irani, a former Google employee.

When I interviewed Suchman, she said that if Google provides technology to Project Maven, the development of fully autonomous weapons could accelerate all at once. Google, she argued, has a corporate responsibility to the people its technology affects, and that responsibility has to outweigh any obligation to a single nation's military. The open letter reads:

If technology companies are to take into account who benefits and who is harmed by their technology, then no question demands more serious scrutiny right now than whether algorithms should be used to identify human targets and kill them remotely, without public accountability. Google moved into military work without public debate or deliberation. If deciding the future of Google's technology without asking the world becomes the norm, the company's entry into military technology will only underscore the dangers of private control over information infrastructure.

Google management, for its part, has been working hard to win employees over. Diane Greene, CEO of Google Cloud, staunchly defended Project Maven at a company meeting shortly after the deal came to light, multiple insiders told me. Greene and employees later held several sessions in which supporters and opponents of Maven laid out their views, illustrating just how difficult it is to formulate guidelines for the ethical use of machine learning.

Some of the departing employees said their dissatisfaction went beyond this incident, citing Google's sponsorship of the Conservative Political Action Conference and its handling of diversity. The employee quoted above said, "I decided to quit when I realized I could no longer recommend that people join the company." Others put it this way:

It's Google's decision, not yours. You don't have to feel responsible for everything your company does. That's what I kept telling myself. But in the end, when there's something I can't accept, I can't help holding myself responsible.

For the record, a Google spokesperson said the following in an April statement:

Google values a corporate culture in which employees are actively engaged in the work we do. We know there are many open questions about the use of new technologies, so these conversations, with employees and with outside experts, are hugely important and beneficial. The technology in question is used to flag images for human review, and is intended to save lives and spare people from having to do highly tedious work. Any military use of machine learning naturally raises valid concerns. Because this is such an important topic, we will continue to engage in a comprehensive discussion across the company and with outside experts as we develop our policies around the development and use of our machine learning technologies.

We have not yet received a response to our request for comment on the mass resignation. In any case, what employees want is action, not statements: establishing a code of ethics, canceling the contract, or both.

Finally, one departing employee's words stuck with me:

Actions speak louder than words, and I wanted to live that way too. I realized that just grumbling about this inside the company wouldn't accomplish anything, so I made the strongest statement I could in my own way, and decided to quit.

Image: turtix/Shutterstock.com
Source: U.S. Department of Defense, Business Insider Japan, The New York Times, Tech Workers Coalition
Kate Conger - Gizmodo US [original] (satomi)