Last month, we told you that YouTube was hiring 10,000 moderators to staff its "Intelligence Desk," a team charged with finding controversial and inappropriate videos before they go viral. Corporations had found themselves sponsoring videos pushing conspiracy theories and far-right content, which got them in trouble with the majority of their customers, who hold more moderate beliefs.
But the Intelligence Desk didn't do such a great job following the school shooting in Parkland, Florida earlier this month. Videos promoted conspiracy theories claiming that the shootings were faked and that the survivors were "crisis actors" who show up at staged shootings across the country at the behest of liberals seeking to repeal the Second Amendment. These were exactly the kind of videos that the 10,000 moderators were supposed to quash before they ended up sponsored by corporations whose customers were aghast to see their ads attached to clips so out of touch with reality.
Parkland survivor David Hogg was branded a crisis actor on conspiracy videos that should have been removed by YouTube
The moderators' failure was evident when a video calling Parkland student David Hogg a crisis actor became the number one trending video on YouTube. And besides failing to keep controversial material from reaching impressionable eyes, the moderators overcorrected, removing certain channels that, while "wide right" (to quote your typical NFL play-by-play man), still deserved freedom of speech protection. Pro-gun videos were also removed from the site.
YouTube realized that it went too far. In a statement, the streaming video site said that newer members of its enforcement team took down some content by mistake. YouTube's official policy allows removal of "harmful or dangerous" and "hateful" videos; if a creator racks up three violations within a three-month period, YouTube will terminate the account.
Any video removed by mistake will be reinstated by YouTube.
"As we work to hire rapidly and ramp up our policy enforcement teams throughout 2018, newer members may misapply some of our policies resulting in mistaken removals. We’re continuing to enforce our existing policies regarding harmful and dangerous content, they have not changed. We’ll reinstate any videos that were removed in error."-YouTube spokesman