“Deepfake” makers are creating alarmingly lifelike videos on home computers using images available on the Web, and ordinary women are bearing the damage.

Airbrushing and Photoshop opened photos to easy manipulation long ago. Now, videos are following suit. The realistic “deepfake” videos are powered by widely available and powerful artificial-intelligence software developed by Google, and they are spreading across the Internet, blurring the line between truth and lie.

But the videos have become the latest shameful means of humiliating, abusing and harassing women. The fakes are explicitly detailed and uploaded to popular porn sites, making them extremely difficult to detect. And even though their legality has not been tested in court, scholars say they may be protected by the First Amendment, although they could also qualify as identity theft, defamation or fraud.

The alarmingly lifelike forgeries have been made using the faces of celebrities as well as of women who are not famous at all, and the actress Scarlett Johansson said that “it’s just a matter of time before any one person is targeted” by a vivid forgery.

Johansson’s face has been superimposed onto several graphic sex scenes over the past year that have spread across the Web: one such video, falsely described as real “leaked” footage, has been viewed more than 1.5 million times on a popular porn site. She said she fears it may already be too late for women and children to protect themselves from the “virtually lawless (online) abyss.”

“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she said. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause. . . . The Internet is a vast wormhole of darkness that eats itself.”

Google added “involuntary synthetic pornographic imagery” to its ban list in September, allowing people to ask the search engine to block results that falsely depict them as “nude or in a sexually explicit situation.” But there is no easy fix for the videos’ creation and spread.

Recent advances in machine-learning technology, adopted by makers seeking to perfect and refine their fake videos, have made them easier to create than ever. All that is needed is a computer and a collection of images of the kind already posted daily to social media.

Anita Sarkeesian, a media critic, has also been a victim; a fake video of her has been viewed more than 30,000 times this year. She said the deepfakes were more proof of “how terrible and awful it is to be a woman on the Internet, where men feel entitled to women’s bodies.”

“For folks who don’t have a high profile, or don’t have any profile at all, this can hurt your job prospects, your interpersonal relationships, your reputation, your mental health,” Sarkeesian said. “It’s used as a weapon to silence women, degrade women, show power over women, reducing us to sex objects. This isn’t just a fun-and-games thing. This can destroy lives.”