While debate rages over whether artificial intelligence will replace human jobs, one long-overlooked job can still only be performed by humans: commercial content reviewer. AI cannot do it yet because the job demands enormous flexibility; assessing whether a piece of content meets publishing standards requires weighing cultural, religious, and many other factors at once.
The Hidden Digital Labor: Commercial Content Review
Commercial content review has been hidden for years: a mechanism comprising the outsourcing companies that supply the labor, the large online platforms that need it, and the reviewers who keep harmful images and information out of users' view. The mechanism is now drawing attention from academic researchers, journalists, technologists, and policymakers, yet its true nature remains opaque, and the social media companies whose businesses depend on it are reluctant to discuss it.
To cut through the barriers tech companies put up and understand what happens behind the scenes to determine what you see on your screen, you have to talk directly to the reviewers. That presents several obstacles: most reviewers' jobs are precarious; companies usually require them to sign confidentiality agreements; and the positions are term-limited, so workers constantly cycle in and out of the employment pool. A further challenge is that this digital labor can be "rebranded" and shifted around the globe under different names, with reviewers dispersed across a worldwide network of outsourcing companies, far from the platforms that ultimately profit from their work.
Why Commercial Content Review Relies on Human Labor
Commercial content review may take place before or after content is published; user complaints in particular trigger the platform's review mechanism and bring in professional reviewers.
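As a rough illustration only, and not any platform's actual system, the complaint-driven path might be modeled like the following sketch; the `FLAG_THRESHOLD` value, the `Post` fields, and the queue are all invented for the example.

```python
from dataclasses import dataclass

FLAG_THRESHOLD = 3  # hypothetical: complaints needed before escalation

@dataclass
class Post:
    post_id: str
    flags: int = 0  # complaints received so far

# Work queue consumed by professional human reviewers.
review_queue: list[Post] = []

def user_complains(post: Post) -> None:
    """Register one user complaint; enough complaints trigger human review."""
    post.flags += 1
    if post.flags >= FLAG_THRESHOLD and post not in review_queue:
        review_queue.append(post)  # a reviewer then decides: keep or remove
```

The point of the sketch is only that the machinery is reactive: published content sits untouched until users complain, and the final keep-or-remove judgment belongs to a person, not the code.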
Commercial content review is a critical part of the production cycle for commercial websites, social media platforms, and media outlets that sustain their online business through user-generated content (UGC). Companies rely on it to protect their brands and platforms (by enforcing user compliance with site rules), to stay in compliance with laws and regulations, and to retain the users who are willing to view and upload content.
The amount of user-generated content on many high-traffic websites is staggering and continues to grow. Scale aside, properly sifting through that content is incredibly complex, far beyond what software and algorithms alone can manage. The problem is not merely technical or computational: reviewing content at such scale remains hard because the theoretical problems of categorizing information have never been solved.
Some content lends itself to batch processing or filtering by other automated means, especially material that has already appeared and been flagged as undesirable in a database. But the process is so complex, with so many factors to weigh and balance at once, that the vast majority of user uploads, especially video and images, still require human intervention to be screened properly. Human reviewers draw on a range of advanced cognitive functions and cultural literacy to judge whether content is appropriate for a site or platform.
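A minimal sketch of that automated first pass, assuming an invented `known_bad_hashes` set standing in for a shared database of previously flagged material (production systems typically use perceptual hashes such as PhotoDNA so that near-duplicates also match; the exact SHA-256 here just keeps the sketch self-contained):

```python
import hashlib

# Hypothetical stand-in for a shared database of fingerprints of content
# already reviewed and flagged as undesirable.
known_bad_hashes: set[str] = set()

def triage(upload: bytes) -> str:
    """Automated first pass: only re-uploads of already-flagged material are
    handled automatically; everything else goes to a human reviewer."""
    fingerprint = hashlib.sha256(upload).hexdigest()
    if fingerprint in known_bad_hashes:
        return "auto-removed"        # matched previously flagged content
    return "needs-human-review"      # cultural judgment required
```

Everything the filter cannot match falls through to the human queue, which is exactly where the bulk of uploads end up.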
Reviewers Cannot Influence Review Criteria
MegaTech's reviewers are required to adhere to a set of internal policies when reviewing content, set by their supervisors, full-time employees in the Security and Policy department. These extremely detailed internal policies, the criteria by which commercial content reviewers judge content, are not disclosed to the public; they are held internally as trade secrets in the service of the company's brand management. Part of the reason for the secrecy is to keep users from exploiting loopholes and posting objectionable content that skirts the line or deliberately circumvents the rules.
Moreover, disclosing commercial content review policies would tell outsiders what the company's reviewers actually do. If the internal policies were exposed, the kinds of content that trigger the review mechanism would be revealed as well, and platforms with polished public images would show their ugliest side. Exposure would also raise difficult questions about the values underlying the policies, something the reviewers themselves surely understand.
Reviewers Are the True Cost of Internet Platforms
Commercial content review is indispensable to the employers and users of every platform, and until computing power and computer vision take a qualitative leap, it will continue to require human intervention. Even then, human labor will likely remain the preferred option, following the trajectory of globalization to places with large pools of cheap labor.
Commercial content reviewers make a series of decisions about each piece of user-generated content that are more complex than anything an algorithm or filter can accomplish, and cultural nuances and linguistic idiosyncrasies add to the difficulty. The human brain is an unparalleled supercomputer, storing vast reserves of cultural knowledge and life experience, and our minds are the equivalent of sophisticated meaning-recognition software; in both cost and capability, human employees still outperform any machine.
Social media platforms hosting user-generated content show no signs of disappearing; on the contrary, they keep evolving as mobile devices spread and the number of people online worldwide grows. Human nature is unlikely to change much either. The need for commercial content review, then, is here to stay.