Every morning, after I feed the cats and make breakfast, I open my MacBook and curse myself. “Twitter,” I say, “what’s the worst thing that happened while I was asleep?”
YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found.
YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation. https://t.co/zNwsd9UsgN
— Max Fisher (@Max_Fisher) June 3, 2019
This problem with YouTube has been brewing for some time. It’s easily correctable on YouTube’s part: they just need to turn off the Recommended feature for videos involving kids, which their algorithm can already identify. However, since that feature is the main driver of ad dollars to videos and to YouTube, they won’t. It’s yet another reminder that Google, YouTube’s parent company, left its ‘Don’t Be Evil’ corporate philosophy behind a long time ago.
Sound familiar? You’re probably a news junkie. The discovery that predators had been using the comment sections of home videos on YouTube to guide other predators to those videos was first reported a few months earlier. Advertisers responded by pulling (or considering pulling) their ads:
Nestlé, Epic Games and other major brands said on Wednesday that they had stopped buying advertisements on YouTube after their ads appeared on children’s videos where pedophiles had infiltrated the comment sections.
The companies acted after a YouTube user posted a video this week to point out this behavior. For the most part, the videos targeted by pedophiles did not violate YouTube’s rules and were innocent enough — young girls doing gymnastics, playing Twister or stretching — but the videos became overrun with suggestive remarks directed at the children.
The YouTube user, Matt Watson, came under criticism for starting an “adpocalypse” (a plunge in ad revenue) by publicising the issue instead of reporting it through YouTube’s own tools, even though combing through thousands of home videos to manually flag every comment would have been a gargantuan task for one person. Months later, with the problem still unchecked and YouTube still intent on keeping its moneymaking algorithm despite the risks, it’s hard to say whether Matt’s approach really worked, or whether there was anything else he could have done.
Pulling Ads — The Beginning
Companies have often dropped ads or sponsorships over something they considered toxic. This hasn’t always worked. Floyd Mayweather, the boxer, remained one of the highest-paid athletes in the world by revenue even without sponsorships. Nike was also recently revealed to have penalised sponsored sportswomen who became pregnant, and only committed to ending the penalties after an op-ed in The New York Times brought a backlash on the brand.
The most recent and well-known example of weaponising the process as a form of political protest came, perhaps unsurprisingly, from a freelance advertising copywriter in San Francisco. Matt Rivitz was incensed after the 2016 US election, and in particular by Steve Bannon of Breitbart News, an American conservative propaganda “news” site. He created an anonymous Twitter account called Sleeping Giants, which encouraged people to take screenshots of ads appearing on Breitbart and forward them to the brands. Those brands were taken aback by images of their ads appearing next to headlines like “Birth Control Makes Women Unattractive and Crazy”. Hundreds of them responded by blacklisting the site, and Sleeping Giants added Fox News to its purview, remaining anonymous until Matt was unmasked against his wishes by the Daily Caller, another conservative site. Via the New York Times:
“The way it happened sucks, but I’m super proud of this thing and of all the people who worked on it and all the people who followed it,” Mr. Rivitz said in his first interview since his involvement in the account was revealed. “We’re happy that we made advertisers think a little bit and realize what they’re supporting.” […]
“I was pretty amazed at the stuff they were printing, and my next thought, being in advertising, was, ‘Who is knowingly supporting this stuff?’” he said. “I thought maybe it would be two to three companies, and I quickly realized within a couple hours it was all placed programmatically.”
Mr. Rivitz was referring to the automated systems that place most online ads and tend to target consumers based on who they are, rather than which site they are visiting.
Sleeping Giants intends to make “bigotry and sexism less profitable”.
“People can use their free speech to say whatever they want and print whatever they want, and that’s what makes this country great,” Mr. Rivitz said, “but it doesn’t mean they need to get paid for it, especially by an advertiser who didn’t know they were paying for it.”
Careful Where You Advertise
It’s easy to pull ads from one particular site, but relying on a blacklist means your ad will eventually just reappear on some other site that doesn’t align with your brand values. Instead of blacklisting sites one by one, it’s better to work from a trusted set of whitelisted sites, each manually verified. You can build a solid whitelist simply by starting with the most visited sites on the Internet and working from there.
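The trade-off above can be sketched in code. This is a minimal illustration of why an allowlist fails safe while a blocklist fails open; the site names and helper functions are hypothetical, not any real ad platform’s API.

```python
# Hypothetical placement filter illustrating blacklist vs whitelist behaviour.
WHITELIST = {"example-news.com", "trusted-blog.com"}  # manually verified sites (made up)
BLACKLIST = {"offbrand-site.com"}                     # sites already flagged (made up)

def allowed_by_blacklist(site: str) -> bool:
    # Blacklist: a placement runs unless explicitly banned,
    # so a brand-new problem site slips through by default.
    return site not in BLACKLIST

def allowed_by_whitelist(site: str) -> bool:
    # Whitelist: a placement runs only if explicitly vetted,
    # so an unknown site is held back by default.
    return site in WHITELIST

new_site = "unknown-new-site.com"
print(allowed_by_blacklist(new_site))  # True: the ad would appear here
print(allowed_by_whitelist(new_site))  # False: the ad waits until the site is vetted
```

The difference is simply the default: the blacklist defaults to “run everywhere”, while the whitelist defaults to “run nowhere”, which is the safer posture for brand protection.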
Pulling ads from entire platforms is a different kettle of fish. Not running ads on major platforms like YouTube as a form of protest isn’t something most brands can afford as part of their digital strategy. After all, YouTube is owned by Google: do you then stop buying AdWords as well? Do you stop buying ads on Facebook and Instagram over all the dodgy things Facebook has done? With the amount of noise out there, small brands with less clout in the marketplace will have to weigh their digital strategies more carefully than major brands, who can generate news simply by boycotting YouTube.
Need more information? Need help with your digital placement strategy? Get in touch.
Image from Bizztor.