Moderating content during an infodemic
Embed robust trust-and-safety measures to increase resilience
Though people are distancing themselves physically, they're staying close virtually. Around the globe, people are socializing in new ways, turning more often to online platforms, posts, and pundits for news, communication, and entertainment. But at the same time, misinformation is multiplying. As people wade through a flood of content, they face a growing array of fraudulent or toxic material. The World Health Organization calls this explosion of myth and rumor a "massive infodemic."

That's why it's so important for companies managing user-generated content to get a solid handle on trust-and-safety issues. Content moderators, who scour the internet to catch and eliminate irresponsible posts, are key to this endeavor.
Leading social media firms are doing their part. They're aggressively imposing stronger rules and guidelines about what's acceptable for users to post. They're moving swiftly to delete inaccurate or purposely misleading material that moderators find. They're also promoting content from health officials and other trusted authorities.
These companies are making this a top priority for good reason. By holding users' online experiences to the highest standards during unsettling times, they're securing customer loyalty today and for the long term.
New opportunities arise from such efforts, as well. If your firm is revamping to guard against exploitative advertisements, toxic content, and fake news, you can also improve the customer experience and gain a competitive edge. For example, using predictive insights drawn from artificial intelligence (AI), you can alert users to potential fraud or other malicious activity and remove damaging content before they suffer any harm.
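To make that idea concrete, here is a minimal sketch of how a predictive flag might drive user-facing action. Everything in it (the score_fraud_risk stand-in, the phrase list, and the cutoffs) is a hypothetical illustration under assumed values, not any platform's actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str

def score_fraud_risk(post: Post) -> float:
    """Stand-in for a trained model; returns a risk score in [0, 1]."""
    suspicious_phrases = ("guaranteed cure", "wire transfer", "act now")
    hits = sum(phrase in post.text.lower() for phrase in suspicious_phrases)
    return min(1.0, 0.4 * hits)

def act_on(post: Post) -> str:
    """Map the score to an action; the cutoffs are illustrative only."""
    risk = score_fraud_risk(post)
    if risk >= 0.8:
        return "remove"  # pull the post before users can be harmed
    if risk >= 0.4:
        return "warn"    # show the post behind a caution banner
    return "allow"

print(act_on(Post("p1", "Guaranteed cure! Act now and wire transfer $99")))
# -> "remove" (three suspicious phrases push the score to 1.0)
```

A real system would replace the phrase matching with a trained classifier, but the shape is the same: score early, act before the content spreads.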
But to emerge stronger from these turbulent times, companies must first overcome a number of challenges.
Content moderation: adapt swiftly
Moderators, the first in line to combat harmful online content, are working alongside AI to protect people from toxic material and help safeguard a company's reputation. Now, social distancing guidelines have compelled companies and employees to adjust to remote working en masse. And even after the pandemic subsides, expanded work-from-home arrangements will be part of the new norm.
Businesses are quickly adapting to the new work conditions. By now, you've likely determined how to share data off site without breaching privacy. But you're grappling with other issues too. How can you continue to protect the mental health of workers when they no longer have access to on-site counseling? How can you shield workers' families from sensitive images?
The most resilient enterprises are developing thoughtful, fast, and innovative responses to these and other questions. They're evolving quickly, learning new ways of working cooperatively and virtually, and forging new styles of leadership.
These companies are also considering the right division of responsibilities between digital and human workforces – and in the face of COVID-19, the balance between AI and human moderators is shifting. Just as top-flight companies have created processes to support remote working, they're also expanding their use of AI to moderate content. Your company needs to make decisions about content faster, and AI, which digests and interprets data rapidly, can help.
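One way to picture that division of labor is confidence-based triage: the model acts on its own only where it is most certain, and routes everything ambiguous to a person. The sketch below is a simplified illustration under assumed thresholds and queue names, not a reference implementation:

```python
from queue import Queue

# Simplified triage between AI and human moderators. The thresholds
# and the queue are assumptions for illustration, not production values.

human_review_queue: Queue = Queue()

AUTO_REMOVE = 0.95  # model is very confident the content violates policy
AUTO_ALLOW = 0.05   # model is very confident the content is fine

def triage(item_id: str, violation_probability: float) -> str:
    """Let the model handle clear cases; send the ambiguous middle to humans."""
    if violation_probability >= AUTO_REMOVE:
        return "auto_removed"
    if violation_probability <= AUTO_ALLOW:
        return "auto_allowed"
    human_review_queue.put(item_id)  # people catch nuance machines miss
    return "queued_for_human_review"

print(triage("post-42", 0.60))  # -> "queued_for_human_review"
```

Adjusting the two thresholds is how a platform shifts the balance: widening the middle band sends more work to people, while narrowing it lets AI clear more of the queue on its own.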
The most sophisticated enterprises will take AI a step further. They'll use AI to gain predictive insights into misinformation trends. And they'll train AI to close any loopholes that can let faulty content slip through.
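As one illustration of what a predictive insight could look like in practice, a team might watch daily counts of flagged posts per topic and alert on sudden surges. In the sketch below, the rolling window and z-score cutoff are arbitrary assumptions, and the data is invented for the example:

```python
import statistics

# Sketch of spike detection on daily flagged-content counts for one topic.
# A z-score against a trailing baseline surfaces unusual surges early.

def detect_spike(daily_counts: list[int], window: int = 14, cutoff: float = 3.0) -> bool:
    """Return True if today's count is a statistical outlier vs. the window."""
    if len(daily_counts) < window + 1:
        return False  # not enough history to form a baseline
    baseline = daily_counts[-(window + 1):-1]
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline) or 1.0  # avoid dividing by zero
    z = (daily_counts[-1] - mean) / stdev
    return z >= cutoff

# Example: a sudden surge in posts flagged under a hypothetical topic.
history = [12, 9, 14, 11, 10, 13, 12, 11, 9, 12, 14, 10, 11, 13, 58]
print(detect_spike(history))  # -> True: worth a closer look by the team
```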
Though technology plays a major role, it takes people to capture the nuances and subtleties in content that machines miss – and that's especially true today. With users sharing rising volumes of content, there's a greater need for human intervention. Even well-meaning but misleading health advice circulates rapidly on social media. Moderators' efforts now have added urgency and pressure.
Given these dual challenges – new working conditions and spikes in volumes – you can expect a backlog of content to build up for moderators to clear. You'll need all hands on deck to address the additional workloads and stress moderators face.
Supporting your content moderators
You likely offer mental-health and well-being support at the office. It's important to make those services available to remote workers too. If your company has a guidebook to help workers manage their exposure to exploitative material, update it with the at-home worker in mind.
For example, you might include suggestions for winding down from a rough day while surrounded by family and without the pause that commuting offers. You can also run online forums and support groups to maintain the bonds people developed at the office. And you can hold virtual workshops that teach skills and competencies for professional development and well-being.
Lock in security while expanding communications channels
In your leadership role, you'll be helping people develop different habits as they collaborate and exchange ideas while physically apart. Communication channels and collaboration tools have to perform without a hitch for teams to do their best remotely.
At the same time, you need robust measures to maintain security and privacy, and to comply with all relevant regulations. Protecting these aspects of online life is critical. For example, safeguard all sensitive and proprietary material against unauthorized downloading and keep it behind a firewall. Establish rules against the use of public Wi-Fi, which can compromise data security, and guard against the risks of remote working, such as exposure to malware.
Steps to adapt to today's shifting world and new future
Clearly, there are many issues to unpack. Start by taking a good look at your existing trust-and-safety infrastructure so you can strengthen and upgrade it to meet the conditions COVID-19 has created.
Some approaches for your content moderators to consider:
- Embed digital technologies effectively
The trust-and-safety measures we've outlined are crucial to building resilience. Though COVID-19 caught much of the world off guard, your organization can emerge from the storm with a stronger, more resilient business and workforce. By showing leadership and confidence in the future, you're also attracting and retaining the best talent. And the care you provide buoys workers in turn.
Firms that respond to the pandemic with innovative approaches to content moderation won't just protect their revenue and business continuity. They'll also build trust among users and a more loyal customer base for the future, strengthening their competitive position.
More to the point, however, they're performing a vital service. They're protecting people from online abuse and misinformation, combating the infodemic, and making their web communities sturdier and safer.