Meta, the parent company of Facebook and Instagram, has been accused of running ads that directed users to online marketplaces selling drugs and other controlled substances. The accusations come as Meta faces a federal investigation in the United States.
According to a July investigation by the Wall Street Journal, Meta continues to earn revenue from ads that violate its own policies prohibiting the sale of illegal drugs. Hundreds of ads promoting substances such as cocaine and opioids continue to appear on Facebook and Instagram. The ads feature images of prescription bottles, pills, and blocks of cocaine, often accompanied by calls to place an order. Federal authorities have been investigating Meta's role in facilitating illegal drug sales since March.
The nonprofit Tech Transparency Project (TTP), which investigates online platforms, reviewed Meta's ad library between March and June and found more than 450 ads for illegal drugs on Facebook and Instagram. Katie Paul, TTP's director, said users could buy dangerous drugs, or fall victim to scams, directly on Facebook without ever visiting the dark web. Mikayla Brown is among the parents who believe Meta should be held responsible for a child's overdose death.
Her son, Elijah Ott, a 15-year-old high school student in California, died in September 2023. An autopsy found a large amount of fentanyl in his system, which was determined to be the cause of death. Brown later found messages on her son's phone linked to an Instagram account that sold illegal drugs. In some cases, ads on Facebook and Instagram led to private group chats on WhatsApp, Meta's encrypted messaging service, where users could easily buy illegal substances. US lawmakers have debated the need to hold tech companies accountable for what third parties post on their platforms.
The Justice Department has expanded its application of federal drug laws to hold internet platforms accountable when they are used to break the law. At a Senate hearing in January, several parents said Meta and other social media companies should be held responsible for their children's deaths. Meta said it uses artificial intelligence (AI) tools to moderate ads on Facebook and Instagram, but that those tools have not been able to block all drug ads, which often redirect users to other platforms where purchases can be made.
A company spokesperson said Meta is working with law enforcement to combat this type of activity, though the company's content moderation teams have been stretched thin by staff cuts in recent years. Meta expressed condolences to those affected by the tragic consequences of drugs and acknowledged the need for collective action to curb illicit substances.
KHANH MINH
Source: https://www.sggp.org.vn/mang-xa-hoi-bi-cao-buoc-quang-cao-chat-cam-post752172.html