Google Content Safety API
A Google Content Safety API is a content safety API that is a Google Cloud Computing Service. It applies deep neural network image classifiers to prioritize likely CSAM content for human review, rather than relying solely on matching against hashes of previously confirmed material.
- See: Microsoft Artemis, CSAM Content.
References
2018
- https://www.blog.google/around-the-globe/google-europe/using-ai-help-organizations-detect-and-report-child-sexual-abuse-material-online/
- QUOTE: ... Today we’re introducing the next step in this fight: cutting-edge artificial intelligence (AI) that significantly advances our existing technologies to dramatically improve how service providers, NGOs, and other technology companies review this content at scale. By using deep neural networks for image processing, we can now assist reviewers sorting through many images by prioritizing the most likely CSAM content for review. While historical approaches to finding this content have relied exclusively on matching against hashes of known CSAM, the classifier keeps up with offenders by also targeting content that has not been previously confirmed as CSAM. Quick identification of new images means that children who are being sexually abused today are much more likely to be identified and protected from further abuse.
We’re making this available for free to NGOs and industry partners via our Content Safety API, a toolkit to increase the capacity to review content in a way that requires fewer people to be exposed to it. …
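The announcement above describes two complementary detection paths: exact matching against hashes of previously confirmed CSAM, and a deep-neural-network classifier that scores previously unseen images so the most likely matches surface first in the human review queue. The sketch below is a minimal, hypothetical illustration of that triage pattern only; it is not the actual Content Safety API, and every name in it (`triage`, `known_hashes`, `classifier_score`, the SHA-256 stand-in for whatever hashing scheme is actually used) is an assumption made for the example.
```python
# Hypothetical sketch of the triage pattern described in the quote:
# known-hash matching plus classifier-based prioritization of unseen images.
# This is NOT the real Content Safety API; all names are illustrative.
from dataclasses import dataclass
from hashlib import sha256
from typing import Callable, Iterable


@dataclass
class ReviewItem:
    image_id: str
    priority: float      # 1.0 for a confirmed hash match, else classifier score
    known_match: bool


def triage(
    images: Iterable[tuple[str, bytes]],        # (image_id, raw image bytes)
    known_hashes: set[str],                     # hashes of previously confirmed CSAM
    classifier_score: Callable[[bytes], float], # assumed DNN scorer returning [0, 1]
) -> list[ReviewItem]:
    """Order images so human reviewers see the most likely matches first."""
    queue = []
    for image_id, data in images:
        digest = sha256(data).hexdigest()  # stand-in for the real hashing scheme
        if digest in known_hashes:
            # Historical approach: exact match against known material.
            queue.append(ReviewItem(image_id, priority=1.0, known_match=True))
        else:
            # Newer approach: the classifier scores content never seen before.
            queue.append(ReviewItem(image_id, classifier_score(data), False))
    return sorted(queue, key=lambda item: item.priority, reverse=True)
```
In practice, Google offers this capability as a hosted service to vetted NGOs and industry partners rather than as client-side code; the sketch only illustrates the prioritization idea the quote describes.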