Humans and AI must be paired for effective online content moderation, survey finds

According to an August 11 Pollfish survey of 1,000 Americans conducted for TELUS International, a leader in digital customer experience, the vast majority (92%) of consumers surveyed think it’s important or very important that online content be reviewed by humans, not just AI. Nearly three-quarters (73%) of those surveyed believe that AI cannot understand or distinguish context and tone as well as a human.

TELUS International designs, produces and delivers next-generation digital solutions to enhance the customer experience for global market-making brands. Its services support clients’ full digital transformation lifecycle and enable them to adopt next-generation digital technologies more quickly to improve their bottom line. The company’s integrated solutions encompass digital strategy, innovation, consulting and design, IT lifecycle management (managed solutions, intelligent automation and comprehensive AI-based data solutions such as computer vision), omnichannel customer experience, and trust and safety, including content moderation.

Content Moderation

TELUS International’s content moderation specialists review and moderate user-generated content (text, images, video, audio) to ensure that it meets not only community guidelines but also local and government regulations. To moderate effectively, the company combines human intervention with technological automation, ensuring that content remains appropriate and relevant.
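The combination of automation and human intervention described above is often implemented as confidence-based triage: an AI classifier handles clear-cut cases automatically and routes ambiguous content to a human moderator. Here is a minimal illustrative sketch of that pattern; the function names, thresholds, and scores are hypothetical, not a description of TELUS International’s actual system:

```python
from dataclasses import dataclass

# Hypothetical thresholds; real systems tune these per policy and language.
AUTO_REMOVE = 0.95   # model is confident the content violates policy
AUTO_APPROVE = 0.05  # model is confident the content is acceptable

@dataclass
class Decision:
    action: str   # "remove", "approve", or "human_review"
    score: float  # the classifier's violation score, kept for auditing

def triage(violation_score: float) -> Decision:
    """Route content by model confidence: clear cases are automated,
    ambiguous ones go to the human review queue."""
    if violation_score >= AUTO_REMOVE:
        return Decision("remove", violation_score)
    if violation_score <= AUTO_APPROVE:
        return Decision("approve", violation_score)
    return Decision("human_review", violation_score)

# A borderline score lands in the human review queue.
print(triage(0.50).action)  # prints "human_review"
```

The thresholds encode the trade-off the survey highlights: widening the band between them sends more nuanced content to humans at the cost of reviewer workload.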

Siobhan Hanna, Senior General Manager, AI Data Solutions at TELUS International, states:

“AI is becoming increasingly effective at detecting digital content that violates brand standards and community guidelines. While it has proven very useful as a first line of defense against harmful content, it is nearly impossible for AI to keep pace with the new types of content that are constantly emerging and the increased use of algospeak. Humans are still needed to make contextual decisions, as AI is limited in its ability to make sometimes difficult judgment calls that take into account the intent behind a particular sentence or image. By taking a human-in-the-loop approach, brands can leverage the speed and efficiency of AI while ensuring that nuanced content is reviewed correctly.”

Increasingly complex content moderation

More than half of respondents (53%) say it has become more difficult for brands, social networks and gaming platforms to monitor content on their sites over the past year. They attribute the difficulty mainly to the following factors:

  • Each platform and channel has more users (66%);
  • Complaining online is becoming more common (54%);
  • Younger generations are turning more to digital (50%);
  • Content is published in more languages (29%);
  • 5G connectivity has increased access to digital networks globally (19%).

Siobhan Hanna concludes:

“Since more and more people are expressing themselves on different digital platforms and in many languages, moderation cannot be done effectively by AI or humans alone. A robust content moderation strategy, built on multiple types of AI whose algorithms are trained on trusted datasets curated by a team of annotators, ensures that data is accurate, context is understood, and bias is responsibly mitigated. AI-powered content moderation tools will continue to improve, but human moderators will always be a necessary resource for keeping digital spaces safe. For this reason, it’s important that brands support content moderators with a robust wellness program that allows them to do the best job possible while protecting both their mental and physical health.”

Translated from the French original: Selon un sondage, humains et IA doivent être associés pour une modération de contenu en ligne efficace