Advertising on social media generates huge revenue for brands. However, it can be risky if their ads are shown next to inappropriate content, as this diminishes brand value and creates negative associations. It also runs the risk of reaching the wrong demographic, which can further damage the brand. Now, in a shocking revelation, online dating platforms Bumble and Match have decided to suspend their advertising on Instagram after a report by The Wall Street Journal found their ads displayed next to sexually explicit and child abuse content within the platform’s Reels feeds. Here is what we know about the incident:
Bumble and Match to stop advertising on Instagram?
Dating apps including Bumble and Match have stopped advertising on Instagram after their ads were displayed next to child-sexualizing content. The Wall Street Journal conducted tests using accounts that followed young gymnasts, cheerleaders, and influencers. The report found that Instagram’s algorithm surfaced explicit and inappropriate content, including risqué footage of children and overtly sexual adult videos, alongside ads for major brands such as Bumble, Disney, and Walmart. This disturbing discovery led Bumble and Match to take immediate action.
The report claimed that Instagram’s system placed an ad for Bumble between a video of someone interacting with a life-size latex doll and another featuring a young girl in a compromising position. However, we have not been able to independently verify these claims.
Some brands have said that Meta is paying for independent audits to determine whether the placement of their ads near inappropriate content is harming their brand name.
Notably, other major brands such as Disney, Pizza Hut, and Walmart were also affected by this issue. According to Meta, the tests conducted by The Wall Street Journal do not represent what billions of users see. While Meta did not issue a public statement on the matter, a company spokesman told WSJ that in October the company introduced new brand safety tools that give advertisers greater control over ad placement. He also said that Instagram removes or reduces the reach of about four million videos every month that appear to violate Meta’s standards.
This incident highlights the urgent need for social media platforms to enhance their content moderation mechanisms and ensure a safer online environment for both users and advertisers.