Meta, the parent company of Facebook and Instagram, has released its latest report on content moderation in India. According to the report, in December 2023, Meta removed over 19.8 million pieces of content across 13 policies for Facebook and over 6.2 million pieces of content across 12 policies for Instagram in India, a significant increase over the previous month's figures.
In addition, during December 1-31, Facebook received 44,332 reports through the Indian grievance mechanism, a process through which users can report content that violates their local laws. The reports related to various issues such as hate speech, fake news, and harassment. Facebook said that it provided tools for users to resolve their issues in 33,072 cases, approximately 74.6% of the total reported cases.
These include pre-established channels to report content for specific violations, self-remediation flows where users can download their data, avenues to address hacked-account issues, and more, Meta said in its monthly report, published in compliance with the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
“Of the other 11,260 reports where specialised review was needed, we reviewed content as per our policies and took action on 6,578 reports in total. The remaining 4,682 reports were reviewed but may not have been actioned,” Meta added. On Instagram, the company received 19,750 reports through the Indian grievance mechanism. “Of these, we provided tools for users to resolve their issues in 9,555 cases,” it said.
Of the other 10,195 reports where specialised review was needed, Meta reviewed the content and took action on 6,028 reports in total. The remaining 4,167 reports were reviewed but may not have been actioned. Under the IT Rules, 2021, large digital and social media platforms with more than 5 million users must publish monthly compliance reports.
“We measure the number of pieces of content (such as posts, photos, videos or comments) we take action on for going against our standards. Taking action could include removing a piece of content from Facebook or Instagram or covering photos or videos that may be disturbing to some audiences with a warning,” said Meta.
In November, Meta took down over 18.3 million pieces of content across 13 policies for Facebook and over 4.7 million pieces of content across 12 policies for Instagram.
Overall, Meta’s report highlights its efforts to combat harmful content on its platforms in India, where it has faced criticism for not doing enough to address content moderation issues.
— Written with inputs from IANS
Author: Shubham Verma