Deepfake technology, an amalgamation of "deep learning" and "fake", lets people superimpose one person's face onto another person's body to create realistic-looking fake visuals.
So far, this AI-driven technology has done more harm than good. Cybersecurity company Deeptrace recently revealed that 15,000 deepfake videos were uploaded daily; 96% of them were outright pornography, in which online predators morphed celebrity faces onto the bodies of adult entertainers. Nor is the abuse limited to celebrity porn: social predators also make deepfake videos of ordinary people, purely for revenge.
Novena Mitra, a woman who experienced image-based abuse online, put it this way:
"I had spent my entire adult life watching helplessly as my image was used against me by men to whom I had never given permission of any kind."
She is just one of thousands of ordinary women being preyed upon by horrific, mass-scale online communities dedicated to sexually exploiting women by morphing their photos into pornography.
Mostly, these deepfakes and other disinformation campaigns against women are spread to silence and humiliate them. In 2018, Indian journalist Rana Ayyub revealed that she became the victim of a deepfake after she demanded justice for the Kathua gang-rape victim.
The problem with online crime is that the law does not progress as fast as technology. Currently, the tech laws in most nations are not robust enough to address the issues arising from AI algorithms, and most countries have no laws that specifically cover deepfake pornography.
Revenge pornography, by contrast, has been illegal in many places for years. In the USA, the Deepfake Accountability Act of 2019 was introduced ahead of the 2020 elections to make the law against such abuse more effective.
AI technology is improving rapidly, but in the end a deepfake is still a fake, and fakes are rarely perfect. Telltale signs include unnatural eye movement (such as little or no blinking), awkward lip-synching, misalignment between body and face, and oddly positioned facial features.
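To make one of these signs concrete, here is a minimal, illustrative sketch of how "unnatural eye movement" could be checked programmatically. It assumes you already have a per-frame eye-aspect-ratio (EAR) series from a face-landmark tool; the function names, thresholds, and the blink-rate range are all assumptions for illustration, not a real detector (production systems use trained models).

```python
# Toy heuristic for one deepfake telltale: implausible blink rate.
# Input: a list of eye-aspect-ratio (EAR) values, one per video frame.
# All thresholds below are loose assumptions, not validated values.

def count_blinks(ear_series, closed_threshold=0.2):
    """Count blinks: transitions from open (EAR >= threshold)
    to closed (EAR < threshold)."""
    blinks = 0
    eye_open = True
    for ear in ear_series:
        if eye_open and ear < closed_threshold:
            blinks += 1
            eye_open = False
        elif ear >= closed_threshold:
            eye_open = True
    return blinks

def looks_suspicious(ear_series, fps=30, min_bpm=5, max_bpm=40):
    """Flag clips whose blink rate falls outside a plausible human
    range (people typically blink roughly 15-20 times per minute;
    early deepfakes often blinked far less)."""
    minutes = len(ear_series) / fps / 60
    if minutes == 0:
        return False
    blinks_per_minute = count_blinks(ear_series) / minutes
    return not (min_bpm <= blinks_per_minute <= max_bpm)

# Example: 60 seconds of frames containing only one brief blink.
frames = [0.3] * 1800
frames[900:905] = [0.1] * 5  # a single 5-frame blink
print(looks_suspicious(frames))  # True: ~1 blink/minute is implausible
```

This only illustrates the idea behind one visual cue; in practice the EAR series would come from a landmark detector, and serious detection combines many such signals with learned classifiers.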
I understand that this news can be depressing and scary, but there are still some ways to protect yourself from becoming a deepfake victim:
If we don't take action on deepfakes, there will come a time when women's images are traded freely and their voices are silenced permanently.