Bumble is launching a "private detector" feature that uses AI to automatically detect lewd images and warn users before they open a photo. Users can then decide whether to view the image, block it, or report it to moderators. The feature is part of a safety initiative by Bumble's co-founders and will also come to Badoo, Chappy, and Lumen, which belong to the same parent dating group, starting in June.
As one of the few dating apps that allow photos to be sent in chat, Bumble already has measures in place to protect users by blurring all images by default. Recipients must press and hold a photo to view it, which then displays the image watermarked with the sender's profile picture. The idea was that tying photos to a sender's profile would deter unwanted lascivious images. As users have discovered, however, little stops someone from creating a fake profile. For example, don't be like "James, 23," below.
Now, lewd photo messages will at least come with a warning that the AI has detected (with 98 percent accuracy, the company claims) potentially inappropriate content.
In addition to the new feature, Bumble CEO and co-founder Whitney Wolfe Herd has been working with Texas lawmakers to pass a bill that would make sending unwanted nude images a crime, punishable by a fine of up to $500. The bill was drafted by Texas State Representative Morgan Meyer, on the grounds that, just as it is illegal to expose oneself on a public street, it should be illegal to do the same online. "Something that is already a crime in the real world has to be a crime online," said Meyer.