Google Nearly Released 100k Health Files Without Consent



Google almost released 100k chest X-ray files as part of an artificial intelligence project. The National Institutes of Health stepped in and advised of the potential violation, which halted the exposure of the data.

This is another example of how the race to bring AI products to market is putting privacy and security in jeopardy. Google is no worse than any other developer (and in this case moved quickly to end the project and avoid releasing sensitive health data); this is happening across the market. It exemplifies how deficient we are as an industry in establishing broadly accepted ethical AI standards that include sampling for representative data, proper security controls, checks for potential impacts to life-safety, and especially safeguards against the generation or exposure of data that can undermine privacy.

AI is a tremendously powerful tool that can bring unimaginable wonders to the world, but it is also a tool that can cause harm and subvert people's rights. It is important that we recognize the risks and move in unison to establish AI ethics standards.

Matthew Rosenquist

Cybersecurity Strategist specializing in the evolution of threats, opportunities, and risks in pursuit of optimal security for our digital world.

Cybersecurity Tomorrow

Cybersecurity strategy perspectives for the emerging risks and opportunities of securing our digital world. The insights of today will lead to tomorrow's security, privacy, and safety foundations.
