Point of view
Moderating content during an infodemic
Embed robust trust-and-safety measures to increase resilience
Though people are distancing themselves physically, they're staying close virtually. Around the globe, people are socializing in new ways, turning more frequently to online platforms, posts, and pundits for news, communication, and entertainment. But at the same time, misinformation is multiplying. As people wade through a flood of content, they face a growing array of fraudulent or toxic material. The World Health Organization calls this explosion of myth and rumor a "massive infodemic." That's why it's so important for companies managing user-generated content to get a solid handle on trust-and-safety issues. Content moderators, who scour the internet to catch and eliminate irresponsible posts, are key to this endeavor.
Maintaining the first line of defense
Leading social media firms are doing their part. They're aggressively imposing stronger rules and guidelines about what's acceptable for users to post. They're moving swiftly to delete inaccurate or purposely misleading material that moderators find. They're also promoting content from health officials and other trusted authorities.
These companies are making this a top priority for good reason. By holding users' online experiences to the highest standards during unsettling times, they're securing customer loyalty today and for the long term.
New opportunities arise from such efforts, as well. If your firm is revamping to guard against exploitative advertisements, toxic content, and fake news, you can also improve the customer experience and gain a competitive edge. For example, using predictive insights drawn from artificial intelligence (AI), you can alert users to potential fraud or other malicious activity and remove damaging content before they suffer any harm.
But to emerge stronger from these turbulent times, companies first have a number of challenges to overcome.
Content moderation: adapt swiftly
Moderators, the first in line to combat harmful online content, are working alongside AI to protect people from toxic material and help safeguard a company's reputation. Now, social distancing guidelines have compelled companies and employees to adjust to remote working en masse. And even after the pandemic subsides, expanded work-from-home arrangements will be part of the new normal.
Businesses are quickly adapting to the new work conditions. By now, you've likely determined how to share data off site without breaching privacy. But you're grappling with other issues too. How can you continue to protect the mental health of workers when they no longer have access to on-site counseling? How can you shield workers' families from sensitive images?
The most resilient enterprises are developing thoughtful, fast, and innovative responses to these and other questions. They're evolving quickly, learning new ways of working cooperatively and virtually, and forging new styles of leadership.
All hands on deck, AI included
These companies are also considering the right division of responsibilities between digital and human workforces – and in the face of COVID-19, the balance between AI and human moderators is shifting. Just as top-flight companies have created processes to support remote working, they're also expanding their use of AI to moderate content. Your company needs to make decisions about content faster. And AI, which digests and interprets data rapidly, can help.
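To make that division of labor concrete, here's a minimal sketch in Python of confidence-threshold triage: a model scores each post, near-certain violations are removed automatically, and the ambiguous middle band is routed to human moderators. The classifier scores, thresholds, and queue names are illustrative assumptions, not any platform's actual settings.

```python
# A minimal sketch of threshold-based triage between AI and human moderators.
# The thresholds and queue names below are illustrative assumptions.

AUTO_REMOVE_THRESHOLD = 0.95   # near-certain violations are removed automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # ambiguous content goes to a human moderator

def triage(post_id: str, violation_score: float) -> str:
    """Route a post based on a model's estimated probability of a policy violation."""
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"         # AI acts alone on high-confidence cases
    if violation_score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review_queue"  # people handle the nuanced middle band
    return "publish"                 # low-risk content flows through untouched

# Example: a post the model is unsure about lands with a human moderator.
print(triage("post-123", 0.72))  # -> human_review_queue
```

In practice, the thresholds would be tuned per policy area, since the cost of a wrong automatic removal differs between, say, spam and health misinformation.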
The most sophisticated enterprises will take AI a step further. They'll use AI to gain predictive insights into misinformation trends. And they'll train AI to close any loopholes that can let faulty content slip through.
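As one illustration of what such predictive insight could look like: if you already log how many posts get flagged per topic each day, a simple baseline comparison can surface an emerging misinformation trend. The data, window size, and threshold below are illustrative assumptions.

```python
# A sketch of spike detection over daily flag counts, assuming you already
# log flags per topic per day. The z-score threshold is illustrative.
from statistics import mean, stdev

def is_spike(daily_flags: list[int], z_threshold: float = 3.0) -> bool:
    """Flag a misinformation trend when today's count far exceeds the recent norm."""
    history, today = daily_flags[:-1], daily_flags[-1]
    if len(history) < 7:
        return False  # not enough history to establish a baseline
    baseline, spread = mean(history), stdev(history)
    return spread > 0 and (today - baseline) / spread >= z_threshold

# Example: a sudden jump in flagged "miracle cure" posts triggers an alert.
print(is_spike([12, 9, 14, 11, 10, 13, 12, 55]))  # -> True
```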
Though technology plays a major role, it takes people to capture the nuances and subtleties in content that machines miss – and that's especially true today. With users sharing rising volumes of content, there's a greater need for human intervention. Even well-meaning but misleading health advice circulates rapidly on social media. Moderators' efforts now have added urgency and pressure.
Given these dual challenges – new working conditions and spikes in volumes – you can expect a backlog of content to build up for moderators to clear. You'll need all hands on deck to address the additional workloads and stress moderators face.
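One way to keep that backlog from overwhelming the team is to order it by urgency rather than arrival time. The sketch below assumes each queued item carries a model-estimated risk score and a timestamp; the ageing weight is an illustrative choice, not a prescribed formula.

```python
# A sketch of ordering a review backlog so moderators see urgent items first.
# Risk scores and the ageing weight are illustrative assumptions.
import time
from dataclasses import dataclass

@dataclass
class QueuedPost:
    post_id: str
    risk_score: float   # model-estimated severity, 0.0 to 1.0
    queued_at: float    # UNIX timestamp when the post entered the queue

def urgency(post: QueuedPost, now: float) -> float:
    """Blend risk with waiting time so old items cannot languish forever."""
    hours_waiting = (now - post.queued_at) / 3600
    return post.risk_score + 0.05 * hours_waiting

def order_backlog(backlog: list[QueuedPost]) -> list[QueuedPost]:
    now = time.time()
    return sorted(backlog, key=lambda p: urgency(p, now), reverse=True)

# Example: a fresh high-risk post outranks an older, lower-risk one.
now = time.time()
backlog = [
    QueuedPost("a", risk_score=0.4, queued_at=now - 5 * 3600),  # waited 5 hours
    QueuedPost("b", risk_score=0.8, queued_at=now - 1 * 3600),  # fresh but risky
]
print([p.post_id for p in order_backlog(backlog)])  # -> ['b', 'a']
```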
Supporting your content moderators
You likely offer mental-health and well-being support at the office. It's important to make those services available to remote workers too. If your company has a guidebook to help workers manage their exposure to exploitative material, update it with the at-home worker in mind.
For example, you might include suggestions for winding down from a rough day while surrounded by family and without the pause that commuting offers. You can also run online forums and support groups to maintain the bonds people have developed at the office. And you can hold virtual workshops that teach skills and competencies for professional development and well-being.
Lock in security while expanding communications channels
In your leadership role, you'll be helping people develop different habits as they collaborate and exchange ideas while physically apart. Communication channels and collaboration tools have to perform without a hitch for teams to do their best remotely.
At the same time, you need robust measures to maintain security and privacy, and to comply with all relevant regulations. Protecting these aspects of online life is critical. For example, you need to safeguard all sensitive and proprietary material against downloading and protect it behind a firewall. Establish rules against the use of public Wi-Fi, which can compromise data security, and guard against the risks of remote working, such as exposure to malware.
Steps to adapt to today's shifting world and new future
Clearly, there are many issues to unpack. Start by taking a good look at your existing trust-and-safety infrastructure so you can strengthen and upgrade it to meet the conditions COVID-19 has created.
Some approaches to consider for your content moderators:
- Update moderators' protocols and practices. Include cost-benefit frameworks for treating content in context. For example, a protocol could authorize moderators to remove false information about COVID-19 because the harm it could cause outweighs the free speech rights of the poster
- Invest in virtual training sessions that refresh staff on procedures and help onboard new personnel. Courses should include policy nuances, decision-making, cultural sensitivities, and COVID-19-specific issues
- Improve tools and processes for virtual content moderation. For example, mask sensitive content on screen while people work from home to protect their families. Workers can unmask it manually as necessary (see the sketch after this list)
- Focus on high-risk priorities, such as live videos and streaming, as well as threats that impact your brand image and platform
- Offer moderators one-to-one virtual counseling sessions. Apply proactive practices and reactive interventions to maintain their well-being
- Upgrade communication tools, such as video conferencing, messaging services, and company calendars, so that teams can work seamlessly from home
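On the masking suggestion above, here is a minimal sketch of what "blurred by default, unmasked on request" could look like, using the Pillow imaging library. The file path and blur radius are illustrative assumptions.

```python
# A sketch of on-screen masking for at-home review: blur sensitive images
# until the moderator explicitly unmasks them. Paths and radius are illustrative.
from PIL import Image, ImageFilter

def load_for_review(path: str, unmask: bool = False) -> Image.Image:
    """Show sensitive images blurred by default; reveal only on request."""
    image = Image.open(path)
    if unmask:
        return image  # moderator chose to see the original
    return image.filter(ImageFilter.GaussianBlur(radius=25))

# Example with a hypothetical file: the flagged image opens blurred,
# and the moderator unmasks it manually when needed.
preview = load_for_review("flagged_upload.jpg")
original = load_for_review("flagged_upload.jpg", unmask=True)
```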
Embed digital technologies effectively:
- Combine AI and human content moderators to manage changing volumes of content and adapt at speed
- Protect user data and access with robust information security. Among other features, it should restrict public Wi-Fi use, wall off sensitive company intelligence, and have the latest malware safeguards in place
- Undertake operations modernization. Upgrade technological interventions where possible to make processes and operations more efficient and effective. Make sure you're using workflow systems and automation technologies to their best advantage
- Self-disrupt. Ask yourself: What have you learned from using technology in new ways that can help you improve your customers' experience in the long term? What potential revenue streams have new COVID-19 conditions revealed? Don't waste a crisis
In uncertain times, adaptability is key
The trust-and-safety measures we've outlined are crucial to building that adaptability. Though COVID-19 caught much of the world off guard, your organization can emerge from the storm stronger, with a more resilient business and workforce. By showing leadership and confidence in the future, you're also attracting and retaining the best talent. And the care you provide, in turn, buoys your workers.
Firms that respond to the impact of the pandemic with innovative approaches to content moderation are protecting their revenue and business continuity. They're also building trust among users and a more loyal customer base for the future, strengthening their competitive position.
More to the point, however, they're performing a vital service. They're protecting people from online abuse and misinformation, combating the infodemic, and making their web communities sturdier and safer.