New Delhi: The menace of deepfake pictures and videos is growing and becoming increasingly worrying. The latest revelation by an international cybersecurity firm will leave you shocked. Researchers at Sensity have found a “deepfake ecosystem” on the encrypted messaging app Telegram, centered around AI-powered bots that can generate fake nudes on request.
The security firm claims that by the end of July 2020, over one lakh women had been targeted and their fake “stripped” images shared publicly.
According to cyber experts, these stripped images can be misused by sharing them in private or public channels beyond Telegram as part of public shaming or extortion-based attacks.
Researchers say that people are using these bots mainly to create nudes of women they know. They copy images of their targets from social media and, after converting them into nudes, share and trade them with one another in various Telegram channels. The software used to generate these images is known as DeepNude.
To “strip” an image, a user simply needs to upload a photo of a target to the bot and receive the processed image after a short generation process.
There are various other similar underground tools, but what is worrying about this bot service is that it is easy to use and widely accessible. It comes with a simple user interface that works on mobile phones as well as computers.
These bots are free to use, but the free version creates fake nudes with watermarks or only partial nudity. However, users can pay more to “uncover” the pictures completely.
“The number of these images grew by 198% in the last three months until July. Self-reporting by the bot’s users indicated that 70% of targets are private individuals whose photos are either taken from social media accounts or private material,” Sensity said in its key findings.
The findings also show that the bot and its affiliated channels have so far gained over a lakh members worldwide, with about 70 per cent of them from Russia and ex-USSR countries.
The misuse of deepfakes is becoming a big concern, as the technology makes it possible to manipulate or fabricate visual and audio content on the internet so that it seems very real. Such software is quite similar to the face-animation techniques used in movies.