The European Union (EU) has agreed to criminalise cyber violence, including explicit deepfake images and videos made without consent. The legislation will also criminalise sending unsolicited nudes.
The new rules will cover images and videos made using AI tools and are intended to prevent revenge porn, cyber-stalking, online harassment, misogynistic hate speech and “cyber flashing”.
The European Commission wants to criminalise these acts across the entire 27-country bloc, meaning they will be illegal even in countries that haven’t yet passed legislation covering these crimes.
“This is an urgent issue to address, given the exponential spread and dramatic impact of violence online,” the announcement says. The directive also covers physical violence and is aimed overall at reducing violence against women in Europe.
The directive will require member states to provide an online portal where victims can easily report offences. Member states have until 2027 to implement these requirements.
Just last week, Taylor Swift became the latest victim of deepfake pornographic images. The explicit images were viewed 47 million times on popular social media platforms before they were finally taken down.
“The latest disgusting way of humiliating women is by sharing intimate images generated by AI in a couple of minutes by anybody,” European Commission Vice President Věra Jourová told Politico in response to the new legislation.
“Such pictures can do huge harm, not only to popstars but to every woman who would have to prove at work or at home that it was a deepfake,” she added.
[via engadget]