New rules from Microsoft ban use of AI for facial recognition by law enforcement

Microsoft has reaffirmed its stance against law enforcement using generative artificial intelligence (AI) for facial recognition, banning the practice on its Azure OpenAI Service and joining other tech giants like Amazon and IBM, which have made similar decisions.

The Washington-based tech giant's revised terms of service for its Azure OpenAI offering explicitly restrict its use "by or for" police departments in the US for facial recognition.

The use of "real-time facial recognition technology" on mobile cameras by any law enforcement globally to attempt to identify an individual is also now explicitly prohibited in uncontrolled, "in the wild" environments. This includes (without limitation) police officers on patrol using body-worn or dash-mounted cameras with facial recognition technology to attempt to identify individuals held in a database of suspects or former prisoners.

The company has since claimed that its original change to the terms of service was an error. Microsoft told TechCrunch that the ban applies only to facial recognition by police in the US, rather than being a blanket ban on police departments using the service.

Why has Microsoft banned facial recognition with its Generative AI service?

This update to Microsoft's Azure OpenAI terms of service comes just a week after Axon, a technology and weapons company, announced a tool built on OpenAI's GPT-4 that summarizes body camera audio.

Using generative AI in this way has several drawbacks, such as the tendency of these tools to "hallucinate" and make false claims (OpenAI is currently the subject of a privacy complaint over its failure to correct inaccurate data produced by ChatGPT), and the racial bias present in facial recognition systems trained on biased data (such as the false facial recognition match late last year that resulted in the wrongful imprisonment of an innocent Black man).

These recent changes reinforce the stance that Microsoft has maintained for many years. During the Black Lives Matter protests in 2020, Microsoft President Brad Smith told The Washington Post, "We will not sell facial-recognition technology to police departments in the United States until we have a national law, grounded in human rights, that will govern this technology."

The current wave of protests around the world over the killing of Palestinians in Gaza has prompted a renewed commitment by tech companies to protect human rights, as reports of police brutality towards protesters continue to emerge in the press.

Featured Image Credit: Generated with Ideogram