Is YouTube doing enough to ban discussion we do not like? - 60 Minutes - CBS News

What this 60 Minutes piece is doing is setting up the ‘public pressure and support’ for YouTube on Dec 10, when it will announce a new ToS, and also ban Steven Crowder, Paul Joseph Watson, and others associated with ‘Infowars’. Get ready, it will happen. Timing relative to Dec 9 is telling.

To grasp the phenomenal scale of YouTube: consider that people spend 1 billion hours watching videos on it, every day. It is the most used social network in the U.S. More queries are typed into the website’s search bar than anywhere online except Google, which owns YouTube.

But the site has come under increasing scrutiny, accused of propagating white supremacy, peddling conspiracies and profiting from it all. The company recently agreed to pay a record $170 million to settle allegations that it targeted children with ads. YouTube is being forced to concentrate on cleansing the site.

We visited the company’s headquarters in San Bruno, California, to meet Susan Wojcicki, the 51-year-old CEO in charge of nurturing the site’s creativity, taming the hate and handling the chaos.

Susan Wojcicki: We have 500 hours of video uploaded every single minute to YouTube.

Lesley Stahl: Fi-- say that again.

Susan Wojcicki: So we have 500 hours of video uploaded every minute to YouTube.

Lesley Stahl: That is breathtaking.

Susan Wojcicki: It, it is, it is. We have a lot of video.

And a lot of influence on our lives, and how we pass our time.

Over a billion people listen to music on YouTube every month: it’s the planet’s top music site. There’s a children’s channel with over 44 billion views.

Lesley Stahl: Do you let your children watch YouTube, including the young ones?

Susan Wojcicki: So I allow my younger kids to use YouTube Kids, but I limit the amount of time that they’re on it. I think too much of anything is not a good thing. But there’s a lot you can learn on YouTube. I think about how YouTube in many ways is this global library. You wanna see any historical speech, you could see it. You want to be able to learn a language–

Lesley Stahl: Make a soufflé?

Susan Wojcicki: --wanna laugh, you just wanna see something funny. A soufflé! Oh, yeah, cooking. Cooking’s a great example.

So’s watching people binge eat. A growing number of American adults are turning to it for their news, sports, medical information. It’s now mankind’s largest “how to” collection: how to tie a tie, tie the knot, or speak Thai.

The site has produced whole new pastimes where millions watch strangers open boxes, whisper, sleep. YouTube’s artificial intelligence algorithms keep recommending new videos so users watch more and more and more.

Wojcicki invited us to the weekly all-staff meeting. She’s surprisingly down-to-earth for one of the most powerful people in Silicon Valley, where her trajectory started in an unlikely way.

Susan Wojcicki: I owned a garage. And I was worried about covering the mortgage. So I was willing to rent my garage to any student. But then two students appeared. One was named Sergey Brin. The other was named Larry Page. They are the founders of Google.

Lesley Stahl: Yes, they are.

Susan Wojcicki: But at the time they were just students. They looked like any other students.

Larry and Sergey ended up hiring her as their first marketing manager: she was Google employee 16. As the company grew, so did her role and so did her family. She has five children. Google bought YouTube on her recommendation, for over $1.6 billion, and eight years later she became CEO, with a mandate to make it grow and make it profitable. And she did. Its estimated worth is $160 billion.

YouTube makes most of its money from ads, splitting revenue with people who create all kinds of videos. From do-it-yourself lessons to hip-hop lessons. The more popular ones can become multimillion dollar entrepreneurs.

YouTube also makes money from political ads, a thorny issue because some of them have been used to spread lies on social media.

Lesley Stahl: Facebook is facing a lot of controversy because it refuses to take down a President Trump ad about Biden which is not true. Would you run that ad?

Susan Wojcicki: So that is an ad that, um, right now would not be a violation of our policies.

Lesley Stahl: Is it on YouTube right now?

Susan Wojcicki: It has been on YouTube.

Lesley Stahl: Can a politician lie on YouTube?

Susan Wojcicki: For every single video I think it’s really important to look at it. Politicians are always accusing their opponents of lying. That said, it’s not okay to have technically manipulated content that would be misleading. For example, there was a video uploaded of Nancy Pelosi. It was slowed down just enough that it was unclear whether or not she was in her full capacity because she was speaking in a slower voice.

Susan Wojcicki: The title of the video actually said drunk, had that in the title. And we removed that video.

Lesley Stahl: How fast did you remove it?

Susan Wojcicki: Very fast.

But not completely. We just did a search and there it was still available. The company keeps trying to erase the purported name of the impeachment whistleblower, but that too is still there. Which raises doubts about their system’s ability to cleanse the site.

In the 2016 election cycle, YouTube failed to detect Russian trolls, who posted over 1,100 videos, almost all meant to influence African-Americans, like this video.

Video: Please don’t vote for Hillary Clinton. She’s not our candidate. She’s a f—ing old racist b----.

YouTube is an “open platform” meaning anyone can upload a video, and so the site has been used to spread disinformation, vile conspiracies, and hate. This past March, a white supremacist livestreamed his killing of dozens of Muslims in Christchurch, New Zealand. He used Facebook, but for the next 24 hours copies of that footage were uploaded on YouTube tens of thousands of times.

Susan Wojcicki: This event was unique because it was really a made-for-Internet type of crisis. Every second there was a new upload. And so our teams around the world were working on this to remove this content. We had just never seen such a huge volume.

Lesley Stahl: I can only imagine when you became CEO of YouTube that you thought, "Oh, this is gonna be so fun. People are uploading wonderful things like–

Susan Wojcicki: Funny cat videos.

Lesley Stahl: --funny. And look at what we’re talking about here. Are you worried that these dark things are beginning to define YouTube?

Susan Wojcicki: I think it’s incredibly important that we have a responsibility framework, and that has been my number one priority. We’re removing content that violates our policies. We removed, just in the last quarter, 9 million videos.

Lesley Stahl: You recently tightened your policy on hate speech.

Susan Wojcicki: Uh-huh.

Lesley Stahl: Why-- why’d you wait so long?

Susan Wojcicki: Well, we have had hate policies since the very beginning of YouTube. And we–

Lesley Stahl: But pretty ineffective.

Susan Wojcicki: What we really had to do was tighten our enforcement of that to make sure we were catching everything and we use a combination of people and machines. So Google as a whole has about 10,000 people that are focused on controversial content.

Lesley Stahl: I’m told that it is very stressful to be looking at these questionable videos all the time. And that there’s actually counselors to make sure that there aren’t mental problems with the people who are doing this work. Is that true?

Susan Wojcicki: It’s a very important area for us. We try to do everything we can to make sure that this is a good work environment. Our reviewers work five hours of the eight hours reviewing videos. They have the opportunity to take a break whenever they want.

Lesley Stahl: I also heard that these monitors, reviewers, sometimes, they’re beginning to buy the conspiracy theories.

Susan Wojcicki: I’ve definitely heard about that. And we work really hard with all of our reviewers to make sure that, you know, we’re providing the right services for them.

Susan Wojcicki showed us two examples of how hard it is to determine what’s too hateful or violent to stay on the site.

Susan Wojcicki: So this is a really hard video to watch.

Lesley Stahl: Really hard.

Susan Wojcicki: And as you can see, these are prisoners in Syria. So you could look at it and say, “Well, should this be removed, because it shows violence, it’s graphic.” But it’s actually uploaded by a group that is trying to expose the violence.

So she left it up. Then she showed us this World War II video.

Lesley Stahl: I mean it’s totally historical footage that you would see on the History Channel.

But she took it down.

Lesley Stahl: Why?

Susan Wojcicki: There is this word down here that you’ll see, 1418.

1418 is code used by white supremacists to identify one another.

Susan Wojcicki: For every area we work with experts, and we know all the hand signals, the messaging, the flags, the songs, and so there’s quite a lot of context that goes into every single video to be able to understand what are they really trying to say with this video.

The struggle for Wojcicki is policing the site, while keeping YouTube an open platform.

Susan Wojcicki: You can go too far and that can become censorship. And so we have been working really hard to figure out what’s the right way to balance responsibility with freedom of speech.

But the private sector is not legally beholden to the First Amendment.

Lesley Stahl: You’re not operating under some-- freedom of speech mandate. You get to pick.

Susan Wojcicki: We do. But we think there’s a lot of benefit from being able to hear from groups and underrepresented groups that otherwise we never would have heard from.

But that means hearing from people with odious messages about gays, women and immigrants.

Wojcicki explained that videos are allowed as long as they don’t cause harm: but her definition of “harm” can seem narrow.

Susan Wojcicki: So if you’re saying, “Don’t hire somebody because of their race,” that’s discrimination. And so that would be an example of something that would be a violation against our policies.

Lesley Stahl: But if you just said, “White people are superior” by itself, that’s okay.

Susan Wojcicki: And nothing else, yes.

But that is harmful in that it gives white extremists a platform to indoctrinate.

And what about medical quackery on the site? Like turmeric can reverse cancer; bleach cures autism; vaccines cause autism.

Once you watch one of these, YouTube’s algorithms might recommend you watch similar content. But no matter how harmful or untruthful, YouTube can’t be held liable for any content, due to a legal protection called Section 230.

Lesley Stahl: The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn’t you be held responsible for that material, because you recommend it?

Susan Wojcicki: Well, our systems wouldn’t work without recommending. And so if–

Lesley Stahl: I’m not saying don’t recommend. I’m just saying be responsible for when you recommend so many times.

Susan Wojcicki: If we were held liable for every single piece of content that we recommended, we would have to review it. That would mean there’d be a much smaller set of information that people would be finding. Much, much smaller.

She told us that earlier this year, YouTube started re-programming its algorithms in the U.S. to recommend questionable videos much less and point users who search for that kind of material to authoritative sources, like news clips. With these changes Wojcicki says they have cut down the amount of time Americans watch controversial content by 70%.

Lesley Stahl: Would you be able to say to the public: we are confident we can police our site?

Susan Wojcicki: YouTube is always going to be different than something like traditional media where every single piece of content is produced and reviewed. We have an open platform. But I know that I can make it better. And that’s why I’m here.


Interesting. Youtube is first using the banning of conservatives, alt-right and objectionable voices as a means to justify the eventual and later banning of all channels that are no longer deemed advertiser friendly.

Youtube is going to try to be a format similar to Netflix in some regard. This hate speech banning is merely an excuse to do it slowly and with some justification that will not trigger backlash from the normies.

The normies will be the next wave after that. Youtube simply obtained a monopoly on a platform anyone could use for free for years and now no longer wants to lose money providing bandwidth and storage for those who do not bring in the shekels.

This hate speech ploy is simply stage one of banning all the channels that will not bring youtube shekels. Audio interviews and music, with no interesting video content that keeps viewers glued to the screen to watch ads, will also be purged.

This isn’t about hate speech. This is about youtube changing its corporate direction and platform to a more profit driven model, but to do so slowly that it does not cause mass protest, exodus or public response by all the small people that actually made youtube what it is today.

Second reason for this is because it is now one year out before the next presidential election and conservative and pro-Trump voices must be silenced to gain an advantage.


“Hate speech”: anything not in line with “our” narrative.

The network that spiked the Epstein story wants to lecture us about social responsibility.


Good, we really shouldn’t be providing platforms for nut job conspiracy theorists to peddle their crap.

Who is “we” Monte?

Who gets to define conspiracy and who gets ruled out of the discussion? If you are truly supportive of this type of suppression of speech based upon political views then you should pack your bags and head to GITMO where you belong.

Remember when @montecresto1 was against all forms of partisanship? How things have changed.

I’m so sick and damn tired of YouTube shutting everybody down because of speech that some spineless little lefty out in California doesn’t like. We know that they shut Christian videos down and we know that they shut conservative voices down. They’ve labeled all of these things as hate speech, which is wrong and offensive. Then when you criticize YouTube, they say you are committing an act of hate speech, and because all of the leftists agree with them, they come after whoever is making the criticism. We need a better way to share videos than YouTube!

I’ve been using MEGA lately. It’s good because you can store all of your files in MEGA and share links to videos privately or publicly. You can also embed the videos just about anywhere.

Here is an example from my (soon to be banned from YouTube) library:


You got me thinking now. That’s a smart idea, because if you download the video before it gets banned and upload it to your personal library, then it would be really hard for someone to take it down. I’m sure the powers that be could complain to MEGA, but if I remember right that’s the company that Kim Dotcom owns, and he clearly doesn’t give a crap about what anyone thinks. I don’t agree with all of his politics, but he has been 100% all about free speech on the internet since the beginning.

Video quality on that is really good too.


Did you guys see the part where she banned that history video because she thought “1418” was a white supremacist slogan? :rofl:

1418 Heil Victory!


The bigger question is, what happened in 1418 that they don’t want us to know about?


Here is a clip of that skeleton Lesley Stahl shilling for YouTube, claiming that they are a platform and not a publisher, while in the same breath saying they censor people and content because of “hate speech,” implying that doesn’t make them a publisher (hint: it does).


Fuck YouTube and Susan Wojcicki. They claim to give everyone equal access, but that’s absolute garbage. They continue to demonetize and adjust their algorithms to essentially de-platform those they haven’t outright banned. Then they wave their hands and say we’re a private company and we can do whatever we want. If that’s the tack they want to take, then they need to lose their Section 230 protection, plain and simple.


Responsible Americans.

Again, it’s not suppression of speech. The government doesn’t control these platforms. There are other such platforms, and the fringe right is free to create their own to peddle whatever they wish.

Nothing’s changed. I oppose partisanship in politics. You should too.

Funny how this happens right before the next big YouTube purge. These guys are obviously coordinating to pacify the public as always. One might even call it a…conspiracy!

Everything seems like a conspiracy to you guys. The public doesn’t like the filthy, hate-filled, bigoted conspiracies that the likes of Alex Jones spew.

Freedom requires responsibility. Be responsible with your speech or lose it.

That isn’t up to you to decide. If people don’t want to watch it they are free (see that free?) not to watch it.

I hate seeing transvestites and global warming lunatics spew their mental derangement but I don’t call for banning their right to speak.

Again, not up to you to decide what is and what is not objectionable to others.
