Google Renounces AI for Weapons, But Will Still Sell to Military


Among the applications Google says it will not pursue are "technologies that gather or use information for surveillance violating internationally accepted norms".

Following the backlash, Google decided not to renew the "Maven" AI contract with the US Defense Department after it expires in 2019.

But the potential of AI systems to pinpoint drone strikes more precisely than military specialists, or to identify dissidents from the mass collection of online communications, has sparked concerns among academic ethicists and Google employees.

The internal and external protests put Google in a difficult position as it aims to recenter its business around the development and use of artificial intelligence.


Google has also made other moves it might not have made in the past, such as blocking apps and tools that try to circumvent censorship in other countries from using its cloud platform.

However, Google went on to confirm that it will continue to work with government bodies and the military.

The AI principles represent a reversal for Google, which initially defended its involvement in Project Maven by noting that the project relied on open-source software that was not being used for explicitly offensive purposes. In response, employees circulated an internal letter arguing that "Google should not be in the business of war".

The company has detailed seven principles guiding its ongoing AI work, which it vows will play a crucial role in its research and product development, even having an "impact on our business decisions", according to Pichai.


Although Google promised to stop working on Project Maven and publicly committed to abiding by the new AI principles, the company still seems interested in remaining a defense contractor and working with the military in "many other areas".

In addition to these ethical guidelines, Google published a starter guide for building responsible AI, including testing for bias and understanding the limitations of the data used to train an algorithm. "This is the reality faced by any developers of what are usually called dual-use technologies".

Other principles include a pledge to "uphold high standards of scientific excellence" and a commitment that "we will work to limit potentially harmful or abusive applications".

The restriction could help Google management defuse months of protest by thousands of employees against the company's work with the United States military to identify objects in drone video. America does not have a great track record when it comes to adhering to "widely accepted principles of international law and human rights" or keeping its word.


"While this means that we will not pursue certain types of government contracts", Greene wrote, "we want to assure our customers and partners that we are still doing everything we can within these guidelines to support our government, the military and our veterans". "These collaborations are important and we'll actively look for more ways to augment the critical work of these organisations and keep service members and civilians safe".
