{"id":17994,"date":"2024-02-01T16:33:16","date_gmt":"2024-02-01T20:33:16","guid":{"rendered":"https:\/\/illumin.com\/?p=17994"},"modified":"2025-01-30T06:05:45","modified_gmt":"2025-01-30T11:05:45","slug":"social-media-moderation-in-2024","status":"publish","type":"post","link":"https:\/\/illumin.com\/insights\/blog\/social-media-moderation-in-2024\/","title":{"rendered":"How brands handle social media moderation in 2024"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">Disinformation and hateful behavior are an unfortunate problem for social media platforms and social media marketers alike, which is why social media moderation is so important. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">While some platforms handle moderation better than others, recent events have solidified the importance of clear policies and effective enforcement for protecting brand integrity.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ever since <a href=\"https:\/\/illumin.com\/insights\/blog\/history-of-social-media-marketing\/\">social media\u2019s meteoric rise<\/a>, community guidelines have proven essential to social media platforms\u2019 business models. 
While some take moderation very seriously, others don\u2019t \u2013 and they pay the price.\u00a0<\/span><\/p>\n<p>&nbsp;<\/p>\n<h2><span style=\"font-weight: 400;\">X is losing advertisers<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">X (formerly Twitter) is <\/span><a href=\"https:\/\/www.nytimes.com\/2023\/11\/24\/business\/x-elon-musk-advertisers.html\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">losing money rapidly<\/span><\/a><span style=\"font-weight: 400;\"> because of poor content moderation and posts from its owner.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">IBM, Apple, Disney, Airbnb, Coca-Cola, Microsoft, and more halted ads on X in November 2023 after owner Elon Musk endorsed an <\/span><a href=\"https:\/\/www.nytimes.com\/2023\/11\/16\/technology\/elon-musk-endorses-antisemitic-post-ibm.html\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">antisemitic conspiracy theory<\/span><\/a><span style=\"font-weight: 400;\">. The pull-outs were estimated to cost X as much as $75 million in advertising revenue by the end of 2023 \u2013 just two months later.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The New York Times reports that, according to internal documents, ad units from more than 200 companies halted or considered pausing their ads on the social network.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">X also discontinued a moderation tool that detected coordinated misinformation and removed a feature that spotted accounts sharing identical media \u2013 a capability crucial for finding and stopping disinformation campaigns.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Stepping away from content moderation has landed X in hot water with both advertisers and governments. 
The EU has issued a formal warning to the company about disinformation on the platform, and X could face major fines if the issue isn\u2019t resolved.\u00a0<\/span><\/p>\n<p>&nbsp;<\/p>\n<h2><span style=\"font-weight: 400;\">Social media moderation is essential for compliance<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">X isn\u2019t the only platform to run into hurdles because of poor community management. <a href=\"https:\/\/illumin.com\/insights\/guides\/meta-advertising-what-you-need-to-know\/\">Meta <\/a>has also had its fair share of content issues, with its <\/span><a href=\"https:\/\/content-na1.emarketer.com\/social-media-misinformation-inflection-point?\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Oversight Board currently examining<\/span><\/a> <span style=\"font-weight: 400;\">a manipulated video of American President Joe Biden. The ruling will likely shape Meta\u2019s Manipulated Media policies in 2024, ahead of the US election.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This investigation comes as governments around the world examine the impact of disinformation campaigns and altered media on elections, and the responsibility social media platforms have to tackle misrepresentation.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Meta also suspended COVID-19-related searches on Threads, underscoring how important social media moderation is to the success and growth of its brand, as well as to maintaining consumer trust and attracting advertisers.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Many governments have passed legislation to protect their citizens from disinformation online, and it behooves social media platforms to understand this legislation and comply with its terms. 
The European Union\u2019s Digital Services Act (DSA) contains <\/span><a href=\"https:\/\/content-na1.emarketer.com\/google-meta-x-face-stricter-content-moderation-rules-eu?_gl=1*8ani0g*_ga*MTMxOTMyNzk1MC4xNjc1NDQ0ODI3*_ga_XXYLHB9SXG*MTcwMjkyNTU1My4zOTkuMS4xNzAyOTI1NjIwLjYwLjAuMA..*_gcl_au*NjE5MzU2NDg5LjE2OTgyMjkzMzIuMTE3MjIyNzg5Ni4xNzAyOTE3ODg1LjE3MDI5MTgwMzI.\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">strict content moderation rules<\/span><\/a><span style=\"font-weight: 400;\"> and requires any platform operating in the EU to follow them or face severe penalties.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The DSA is just one piece of legislation passed in recent years, but it marks a significant shift in digital regulation. It compels companies like Meta and X to prioritize content moderation and conduct proper risk analyses to protect European citizens\u2019 privacy and to prevent misinformation and the misrepresentation of public figures.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The DSA applies to 19 companies, all of which it defines as &#8220;<\/span><a href=\"https:\/\/content-na1.emarketer.com\/google-meta-x-face-stricter-content-moderation-rules-eu?_gl=1*8ani0g*_ga*MTMxOTMyNzk1MC4xNjc1NDQ0ODI3*_ga_XXYLHB9SXG*MTcwMjkyNTU1My4zOTkuMS4xNzAyOTI1NjIwLjYwLjAuMA..*_gcl_au*NjE5MzU2NDg5LjE2OTgyMjkzMzIuMTE3MjIyNzg5Ni4xNzAyOTE3ODg1LjE3MDI5MTgwMzI\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">very large online platforms<\/span><\/a><span style=\"font-weight: 400;\">&#8221; \u2013 those with more than 45 million monthly users. This is likely only a first step from the EU in regulating user-generated digital content \u2013 other countries are poised to enact similar legislation in the coming years.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The writing is on the wall: effective social media moderation is a critical part of successfully running a social media platform. 
Advertisers are paying close attention to which platforms take the task seriously. When user-generated content becomes hateful or spreads disinformation, consumer trust erodes and advertisers are quick to move on, opting to spend their ad dollars where they can more effectively build trust with their audiences.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Trust is central to effective advertising \u2013 and that makes social media moderation essential for building an advertiser-friendly platform.<\/span><\/p>\n<p>&nbsp;<\/p>\n<div class=\"micro-cta-wrap pos-relative\"><div class=\"micro-cta-main radius-24 flex pos-relative\"><div class=\"micro-cta-image\"><figure class=\"micro-cta-thumb object-fit pos-relative\"><img decoding=\"async\" src=\"https:\/\/illumin.com\/wp-content\/uploads\/2023\/03\/demo_CTA-2.png\" alt=\"micro-cta@2x\" title=\"\"><\/figure><\/div><div class=\"micro-cta-text pos-relative\"><span class=\"optional-text\">Made for marketers<\/span><div class=\"h3\">Learn how illumin unlocks the power of journey advertising<\/div><a href=\"https:\/\/illumin.com\/product\/request-a-demo\/\" class=\"button\">Get started!<\/a><\/div><\/div><\/div>\n<p><span style=\"font-weight: 400;\">To see more from illumin, be sure to follow us on <a href=\"https:\/\/twitter.com\/illuminHQ\" target=\"_blank\" rel=\"noopener\">X<\/a><\/span><span style=\"font-weight: 400;\">\u00a0<\/span><span style=\"font-weight: 400;\">and <\/span><a href=\"https:\/\/www.linkedin.com\/company\/illuminhq\/\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">LinkedIn<\/span><\/a><span style=\"font-weight: 400;\"> where we share interesting news and insights from the worlds of ad tech and advertising.<\/span><\/p>\n<p><a class=\"twitter-follow-button\" href=\"https:\/\/twitter.com\/illuminHQ?ref_src=twsrc%5Etfw\" data-show-count=\"false\" target=\"_blank\" rel=\"noopener\">Follow @illuminHQ<\/a><script 
src=\"https:\/\/platform.twitter.com\/widgets.js\" async=\"\" charset=\"utf-8\"><\/script><\/p>\n<p><script src=\"\/\/platform.linkedin.com\/in.js\" type=\"text\/javascript\"> \u00a0 lang: en_US <\/script><script type=\"IN\/FollowCompany\" data-id=\"1189312\" data-counter=\"right\"><\/script><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Disinformation and hateful behavior are an unfortunate problem for social media platforms and social media marketers alike, which is why social media moderation is so important. While some platforms handle moderation better than others, recent events have solidified the importance of clear policies and effective enforcement for protecting brand integrity.\u00a0 Ever since social media\u2019s meteoric [&hellip;]<\/p>\n","protected":false},"author":10,"featured_media":17998,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"content-type":"","footnotes":""},"categories":[31,19],"tags":[],"class_list":["post-17994","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog","category-insights"],"acf":[],"_links":{"self":[{"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/posts\/17994","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/users\/10"}],"replies":[{"embeddable":true,"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/comments?post=17994"}],"version-history":[{"count":0,"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/posts\/17994\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/media\/17998"}],"wp:attachment":[{"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/media?parent=17994"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/illumin.co
m\/wp-json\/wp\/v2\/categories?post=17994"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/illumin.com\/wp-json\/wp\/v2\/tags?post=17994"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}