Deepfake Porn: Another Growing Menace in the AI Race

Creating deepfakes will become as simple as pressing a button as the technology continues to advance.

Artificial intelligence imagery can be used to create art, try on clothing in virtual dressing rooms, and help plan marketing campaigns. But experts fear that the darker side of these increasingly accessible tools could worsen a problem that mostly harms women: nonconsensual deepfake pornography.

About Deepfakes

Deepfakes are videos and images that have been digitally created or altered using machine learning or artificial intelligence. Pornography produced with the technology first began spreading across the internet several years ago, when a Reddit user shared clips that placed the faces of famous women onto the bodies of porn actors.
Since then, deepfake creators have distributed similar videos and images targeting journalists, online influencers, and other people with a high public profile. Thousands of videos are hosted across several websites, and some sites let users create their own images, essentially allowing anyone to turn whomever they wish into the subject of their sexual fantasies without that person's knowledge, or to use the technology to harm former partners.

According to experts, the problem grew as it became easier to create sophisticated deepfakes. They warn that it could get worse with the evolution of generative artificial intelligence (AI) systems, which are trained on billions of online images and can generate new content from existing data.

Adam Dodge on Deepfakes

According to Adam Dodge, the founder of EndTAB, a group that offers training on technology-enabled abuse, the reality is that the technology will continue to advance and will become as simple as pressing a button. As long as that holds, people will continue to misuse it to hurt others, most commonly through online sexual violence, deepfake pornography, and fake nude photographs.

Noelle Martin

That is something Noelle Martin of Perth, Australia, has experienced firsthand. The 28-year-old discovered fake pornography of herself 10 years ago when, out of curiosity one day, she used Google to search for an image of herself. To this day, Martin says she does not know who created the fake photographs and videos that depict her in sexual encounters.