The Rise of Evil Photoshop

Photoshop isn’t always bad, I tell myself as I open CNN and see that the latest news involves Trump appearing in front of a doctored presidential seal featuring a Russian-style two-headed eagle clutching golf clubs. It was displayed at a conservative student summit; apparently the whole thing was a mistake born of a time crunch, but heads rolled and I laughed. Photoshop is a fun program. For many young designers, it’s their first introduction to Adobe Creative Suite; in design school, everyone knew how to use it. Kotaku and other sites regularly run funny Photoshopping contests. That being said, it’s also often used for evil or outright illegal purposes.

If you’ve been following the news, you might have seen that certain people thought it would be funny to make an app that auto-undresses women. Just women, mind you: if a guy’s picture was put into the app, it just added boobs to it. Naturally, the app promptly went viral, because there are a lot of evil bastards in the world. Via Vice:

“We created this project for users’ entertainment months ago,” he wrote in a statement attached to a tweet. “We thought we were selling a few sales every month in a controlled manner… We never thought it would become viral and we would not be able to control traffic.”

When I spoke to Alberto in an email Wednesday, he said that he had grappled with questions of morality and ethical use of this app. “Is this right? Can it hurt someone?” he said he asked himself. “I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial),” he said. If the technology is out there, he reasoned, someone would eventually create this.

Since then, according to the statement, he’s decided that he didn’t want to be the one responsible for this technology.

“We don’t want to make money this way,” the statement said. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones to sell it.” He claimed that he’s just a “technology enthusiast,” motivated by curiosity and a desire to learn. This is the same refrain the maker of deepfakes gave Motherboard in December 2017: that he was just a programmer with an interest in machine learning. But as the subsequent rise of fake revenge porn created using deepfakes illustrated, tinkering using women’s bodies is a damaging, sometimes life-destroying venture for the victims of “enthusiasts.”

“The world is not yet ready for DeepNude,” the statement concluded. But as these victimizing algorithms and apps show, there is no simple solution for technology like DeepNudes, and the societal attitudes that erase women’s bodily autonomy and consent.

There are a few takeaways from this. Firstly, I think it’s hilarious that Alberto thought he was creating an app “motivated by curiosity and a desire to learn” when it could destroy women’s lives with fake revenge porn. It’s either disingenuous (“Uh oh, I now realize I could be sued for this!”) or extremely ignorant (“I’m so privileged I didn’t realize this might be a problem”). Secondly, it’s true that anyone with Photoshop could do what DeepNude does. If you buy or pirate Photoshop and invest the time to learn it (it WILL take more than a few hours), you can indeed create fake nude images and spread them around to get someone fired, hurt them enough to drive them to depression or suicide, or worse. Not to mention that anything that gets on the internet tends to stay on the internet.

What Can Victims of Evil Photoshop Do?

Depending on the problem, there may be legal recourse. Many countries now have revenge porn laws on the books. In Australia, there are both civil and criminal avenues:

“When we’ve spoken with people who’ve been individual victims of this type of behaviour — which is terrible behaviour, obviously — what they say is what they really want is some ability to really quickly compel that person to stop doing it and to compel people who’ve received it to take it down if it’s posted on Facebook or to remove it if it’s been texted or emailed to someone,” Porter told 6PR on Thursday afternoon.

“And so we’ve set up this civil penalty regime, which basically allows for this really quick take-down, if you like, of that type of posting. In addition to which, we’ve looked at the existing offences and we’ve toughened them up, so that it will now be the case that if you send an image which we have defined as a private sexual image of someone and you do that in a way that’s unreasonable — including, obviously, consideration as to whether or not the person consented — then you can face a penalty of five years.

“If you do that and you’ve been the subject of three or more of these civil penalty orders — which is the regime that sits underneath it — then you can be guilty of an offence with a penalty up to seven years.”

For everything else, things are more complicated. Deep fakes, computer-generated replicas of a person saying or doing things they never actually said or did, already exist. In May 2018, a Belgian political party, Socialistische Partij Anders (sp.a), created a deep fake video of Trump purportedly offering climate change advice to the people of Belgium. The party assumed the poor quality of the fake would alert people to the fact that it was a parody. Naturally, it didn’t. The video went viral and sp.a went into damage control. If even a small-scale, low-quality deep fake can rattle our already fragile news ecosystem, a sophisticated one could do far worse. Via the Guardian:

Citron and Chesney are not alone in these fears. In April, the film director Jordan Peele and BuzzFeed released a deep fake of Barack Obama calling Trump a “total and complete dipshit” to raise awareness about how AI-generated synthetic media might be used to distort and manipulate reality. In September, three members of Congress sent a letter to the director of national intelligence, raising the alarm about how deep fakes could be harnessed by “disinformation campaigns in our elections”.

The specter of politically motivated deep fakes disrupting elections is at the top of Citron’s concerns. “What keeps me awake at night is a hypothetical scenario where, before the vote in Texas, someone releases a deep fake of Beto O’Rourke having sex with a prostitute, or something,” Citron told me. “Now, I know that this would be easily refutable, but if this drops the night before, you can’t debunk it before serious damage has spread.”

The problem, the article noted, isn’t even whether people can easily identify a fake video, or whether technology exists that can detect one:

Indeed, as the fake video of Trump that spread through social networks in Belgium earlier this year demonstrated, deep fakes don’t need to be undetectable or even convincing to be believed and do damage. It is possible that the greatest threat posed by deep fakes lies not in the fake content itself, but in the mere possibility of their existence.

This is a phenomenon that scholar Aviv Ovadya has called “reality apathy”, whereby constant contact with misinformation compels people to stop trusting what they see and hear. In other words, the greatest threat isn’t that people will be deceived, but that they will come to regard everything as deception.

Recent polls indicate that trust in major institutions and the media is dropping. The proliferation of deep fakes, Ovadya says, is likely to exacerbate this trend.

In other words, no, there’s nothing much we can do to prevent the increasing sophistication of evil Photoshop and its kin. It’s all very well to ask people to apply an extra-healthy dose of cynicism and scepticism to anything that appears too good to be true, but in today’s time-sensitive world, that’s a hard ask.
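That said, the detection tech mentioned above isn’t pure science fiction, at least for still images. Below is a minimal sketch of error level analysis (ELA), one classic forensic heuristic for spotting edited regions in a JPEG. It’s purely illustrative rather than anything from the article: it assumes the Pillow library, the file names are hypothetical, and real forensics is far harder than this.

```python
import io
from PIL import Image, ImageChops

def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
    """Return a brightened difference map between an image and a re-saved copy."""
    original = Image.open(path).convert("RGB")

    # Re-compress the image in memory at a fixed JPEG quality.
    buffer = io.BytesIO()
    original.save(buffer, "JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer).convert("RGB")

    # Pixel-wise difference: regions that were pasted in or retouched
    # often recompress differently from the rest of the photo.
    ela = ImageChops.difference(original, resaved)

    # The raw differences are faint, so stretch them to full range.
    extrema = ela.getextrema()  # ((min, max), ...) per channel
    max_diff = max(channel_max for _, channel_max in extrema) or 1
    return ela.point(lambda v: min(255, v * 255 // max_diff))

# Hypothetical usage: "suspect.jpg" stands in for any image under scrutiny.
error_level_analysis("suspect.jpg").save("ela_map.png")
```

The idea is that a manipulated region usually lights up as a brighter patch in the difference map. A clean result proves nothing, of course, which is rather the article’s point.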

A Retouching Law

Altering images to remove small flaws is nothing new. We do it ourselves: retouching images to make them more beautiful. I’ve worked in a studio with mostly fashion clients where we’d regularly lengthen the legs, lengthen the neck, change the hair and the nails, all in post-production. The “it can all be fixed in post” attitude filters in from design school. A classmate once told me that in first year he would have reshot an image with an error in the set, but by third year it was simply easier to erase it in post than to fiddle with cameras and lighting.

When retouching is taken to extremes, however, we get what I’d call Strangely Socially Acceptable Evil Photoshop. Look at the image above. Both shots are of the same model, Filippa Hamilton. See the problem? The image on the left drew criticism for altering her body to an impossible standard.


Such standards do lasting damage to people, or worse. In 2007, Hila Elmaliach, a well-known Israeli model, died of complications from anorexia at age 34. She was 5’8” and weighed less than 22 kg at her death. She’d developed an eating disorder after becoming a model at 13. The American Medical Association also released a statement in 2011 about image alteration:

“The appearance of advertisements with extremely altered models can create unrealistic expectations of appropriate body image. In one image, a model’s waist was slimmed so severely, her head appeared to be wider than her waist. We must stop exposing impressionable children and teenagers to advertisements portraying models with body types only attainable with the help of photo editing software.”

Israel has since passed legislation requiring models to have a BMI of at least 18.5, and several countries now have retouching laws on the books. Via Pixelz:

In France, a law that went into effect in October of 2017 requires a “photographie retouchée” label on photos that have been digitally altered to make a model’s silhouette narrower or wider; it also requires an every other year health exam for models, to medically certify they are healthy enough to work.
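Since both the Israeli law and Elmaliach’s case come down to BMI, here is the arithmetic spelled out, as a quick sketch: the formula is the standard one, the 1.73 m figure is my conversion of 5’8”, and the 52 kg example is hypothetical.

```python
# BMI is weight in kilograms divided by height in metres squared.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

# Hypothetical model just under Israel's legal minimum of 18.5:
print(round(bmi(52.0, 1.68), 1))  # -> 18.4, too low to work legally

# Hila Elmaliach's reported figures: under 22 kg at 5'8" (~1.73 m).
print(round(bmi(22.0, 1.73), 1))  # -> 7.4, less than half the minimum
```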

Getty Images has now banned “any creative content depicting models whose body shapes have been retouched to make them look thinner or larger,” and there has also been celebrity backlash on the issue.

The Things We Do

People have less and less of an appetite for heavily retouched imagery. Aerie, American Eagle’s lingerie line, ran a campaign celebrating women of all body types.

It resulted in a 20% increase in sales in 2015. And of course, there’s Dove’s heavy push towards body positivity, with award-winning ads that have raised the profile of its products. So how much retouching (if any) is OK, and how much is not? In our opinion, retouching minor things like dirt, temporary blemishes, and lighting is fine. These edits don’t fundamentally change what the image is; they just clean up the presentation. Respect the model and respect your audience, and, as with all ad and marketing content, create it while trying to do no evil. That’s the best advice we can put forward in this day and age. Want to know more? Get in touch.


Image by Alex Wong / Getty Images / as seen on CNN
