24.10.2022, 21:36, Source: Engadget

Bumble open-sourced its AI tool for catching unwanted nudes

Since 2019, Bumble has used machine learning to protect its users from lewd photos. Dubbed Private Detector, the feature screens images sent by matches to determine whether they depict inappropriate content. It was primarily designed to catch unsolicited nude photos, but it can also flag shirtless selfies and images of guns, neither of which is allowed on Bumble. When there's a positive match, the app blurs the offending image, letting you decide whether to view it, block it, or report the person who sent it.

In a recent blog post, Bumble announced that it is open-sourcing Private Detector, making the framework available on GitHub. "It's our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place," the company said, acknowledging in the process that it is only one of many players in the online dating market.

Unwanted sexual advances are a frequent reality for many women, both online and in the real world…
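The screen-then-blur flow described above can be sketched in a few lines. This is a hypothetical illustration of the general pattern, not Bumble's actual code or API: a classifier assigns each incoming image a score, anything above a threshold is blurred, and the recipient then explicitly chooses to view, block, or report. The function names and the threshold value are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class ScreeningResult:
    """Outcome of running the classifier on one incoming image."""
    blurred: bool  # True if the image should be shown blurred
    score: float   # classifier confidence that the image is inappropriate

def screen_image(score: float, threshold: float = 0.8) -> ScreeningResult:
    """Blur any image whose classifier score meets the threshold.

    The 0.8 threshold is illustrative; a real deployment would tune it
    against precision/recall requirements.
    """
    return ScreeningResult(blurred=score >= threshold, score=score)

def recipient_action(result: ScreeningResult, choice: str) -> str:
    """Resolve the recipient's decision for a screened image.

    Unblurred images are simply shown; blurred images require an
    explicit choice to view, block, or report.
    """
    if not result.blurred:
        return "shown"
    if choice not in {"view", "block", "report"}:
        raise ValueError(f"unknown choice: {choice!r}")
    return choice
```

In this sketch the classifier itself is abstracted away as a score in [0, 1]; the open-sourced framework on GitHub is where the actual model lives.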

Continue reading at Engadget


JustMac.info © Thomas Lohner - Legal Notice - Privacy Policy