Fake Porn Videos Removed from the Internet

Pornographic videos that use new software to replace the faces of the original performers with those of celebrities are now being removed by Gfycat, a service widely known to host much of this type of content.

In a brief statement, the San Francisco-based company said, “Our terms of service allow us to remove content that we find objectionable. We are actively removing this content”.

With the proliferation of mobile devices and apps, the number of users who know where and how to find deepfake pornographic videos appears to have risen steadily. Creating such videos has also become more common, particularly after the release of a free tool that made the process far easier.

According to the BBC, the tool's developer says FakeApp has been downloaded more than 100,000 times.

It works by using a machine-learning algorithm to create a computer-generated version of the subject’s face.

To do this, it requires several hundred photos of the celebrity in question for analysis, along with a video clip of the person whose features are to be replaced.
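The underlying technique is commonly described as a shared-encoder autoencoder: one encoder learns a compressed representation of faces from both people, while a separate decoder is trained for each identity, so a frame of one person can be decoded with the other's decoder. The sketch below illustrates that idea in miniature; the dimensions, learning rate, and random stand-in "photo" data are illustrative placeholders, not details of FakeApp itself.

```python
import numpy as np

# Illustrative sketch of the deepfake training setup: a shared (linear)
# encoder plus one decoder per identity. Real systems use deep
# convolutional networks; this toy uses single linear layers.
rng = np.random.default_rng(0)
DIM, LATENT = 64, 16            # flattened-image size and bottleneck size

encoder = rng.normal(0, 0.1, (DIM, LATENT))      # shared by both identities
decoder_a = rng.normal(0, 0.1, (LATENT, DIM))    # reconstructs person A
decoder_b = rng.normal(0, 0.1, (LATENT, DIM))    # reconstructs person B

def train_step(x, decoder, lr=0.001):
    """One SGD step on a linear autoencoder with squared-error loss."""
    z = x @ encoder
    recon = z @ decoder
    err = recon - x                              # gradient of 0.5*||recon - x||^2
    decoder -= lr * np.outer(z, err)             # in-place update of this decoder
    encoder[...] -= lr * np.outer(x, err @ decoder.T)

# "Several hundred photos" of each person -- here just random vectors.
photos_a = rng.normal(size=(200, DIM))
photos_b = rng.normal(size=(200, DIM))

for _ in range(5):                               # a few passes over the data
    for x in photos_a:
        train_step(x, decoder_a)
    for x in photos_b:
        train_step(x, decoder_b)

# The swap itself: encode a frame of person B, decode with A's decoder,
# yielding B's pose/expression rendered with A's learned appearance.
frame_of_b = photos_b[0]
swapped = (frame_of_b @ encoder) @ decoder_a
print(swapped.shape)
```

Because the encoder is shared, it learns features common to both faces (pose, expression, lighting), while each decoder learns one person's appearance; that split is what makes the swap possible.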

The results – known as deepfakes – differ in quality. But in some cases, where the two people involved are of a similar build, they can be quite convincing. Working from existing pornographic footage, some users featured popular actresses and singers, then uploaded their clips to Gfycat and shared them on social networking sites including Reddit.

While many users have applied the technology to non-pornographic content, it has, like many new tools, proved prone to abuse, leading to this decision.
