The Vietnamese government has mandated that international social media platforms deploy artificial intelligence (AI) models capable of identifying and removing “toxic” content automatically. This directive is part of the country’s ongoing efforts to regulate online activity more stringently, according to reports released by state media on Friday.
Vietnam has repeatedly pressed major firms such as Meta’s Facebook, Google’s YouTube, and TikTok to work with local authorities to remove content it categorizes as “toxic,” a label that covers offensive, false, and anti-government material.
“This marks the first time the government has issued such an order,” state-controlled broadcaster Vietnam Television (VTV) reported during a mid-year review event hosted by the information ministry, which was attended by select media representatives.
The report did not say when or how the cross-border platforms are expected to comply with the new requirement.
In the first half of this year, Facebook complied with government requests by removing 2,549 posts, while YouTube took down 6,101 videos, and TikTok eliminated 415 links, as stated by the information ministry.
The announcement comes as other Southeast Asian nations develop governance and ethical frameworks for AI, aiming to set safeguards around the technology’s rapid expansion, Reuters reported earlier this month.
In recent years, Vietnam has enacted various regulations along with a cybersecurity law targeting foreign social media platforms. These measures are intended to combat misinformation and require foreign tech companies to set up local offices and store data within the country.
Last month, the information ministry conducted a thorough inspection of TikTok’s local operations and said it had found numerous violations in the platform’s practices.
During the recent event, VTV also reported that the American streaming giant Netflix has submitted necessary documentation to establish a local office in Vietnam.
© Thomson Reuters 2023