AI is not the only answer
Leading social content platforms are investing heavily in sophisticated AI that automatically detects inappropriate content. Its effectiveness, however, varies with the type of content being monitored. Not surprisingly, AI is better at identifying nudity than hate speech. Hate speech is more subjective and harder to detect because it relies on linguistic nuance, slang, and emoji.
That said, categorizing nudity in content is not without its challenges for AI. For example, Nick Ut's famous 1972 photograph, The Terror of War, shows a little girl running naked as her village burns behind her. She is nude, but should the photo be deleted given its historic significance?
Misclassification can happen, but combining technology with human reviewers adds a layer of context that AI cannot currently achieve. It also allows policies and training to be applied more contextually. For example, a family vacation photo of a man without his shirt on may technically contain nudity but, in context, is not offensive.