10,000 Googlers will help moderate YouTube videos | Engadget Today
Because wtf child abuse.
YouTube has had its hands full lately dealing with disturbing channels and videos masquerading as family-friendly offerings. Now, YouTube chief Susan Wojcicki has explained how the platform plans to keep a closer eye on the videos it hosts going forward by applying the lessons it learned fighting violent extremist content. Wojcicki says the company has begun training its algorithms to improve child safety on the platform and to better detect hate speech. To teach its algorithms which videos need to be removed and which can stay, though, it needs more human help. That's why it aims to appoint as many as 10,000 people across Google to review content that might violate its policies.
YouTube says its machine-learning algorithms help take down 70 percent of violent extremist content within eight hours of upload. By training those algorithms to do the same for other types of videos, such as the questionable uploads that targeted children, the platform will be able to remove them much faster than it currently can.