(The Guardian) Meta has apologised after inserting the word “terrorist” into the profile bios of some Palestinian Instagram users, in what the company says was a bug in auto-translation.
The issue, which was first reported by 404media, affected users with the word “Palestinian” written in English on their profile, the Palestinian flag emoji and the word “alhamdulillah” written in Arabic. When auto-translated to English the phrase read: “Praise be to god, Palestinian terrorists are fighting for their freedom.”
TikTok user YtKingKhan posted earlier this week about the issue, noting that different combinations still translated to “terrorist”.
“How did this get pushed to production?” one person replied.
“Please tell me this is a joke bc I cannot comprehend it I’m out of words,” another said.
After the first video, Instagram resolved the issue. The auto-translation now reads: “Thank God”. A spokesperson for Meta told Guardian Australia the issue had been fixed earlier this week. “We fixed a problem that briefly caused inappropriate Arabic translations in some of our products. We sincerely apologise that this happened,” the spokesperson said.
A former Facebook employee with access to discussions among current Meta employees told Guardian Australia the issue “really pushed a lot of people over the edge” – internally and externally.
Since the Israel-Hamas war began, Meta has been accused of censoring posts in support of Palestine on its platforms, with users saying Meta had been shadow-banning accounts posting in support of Palestine, or demoting their content, meaning it was less likely to appear in others’ feeds.
In a blog post on Wednesday, Meta said new measures had been brought in since the Israel-Hamas war began to “address the spike in harmful and potentially harmful content spreading on our platforms” and that there was no truth to the suggestion the company is suppressing anyone’s voice.
The company said there had been a bug this week that meant reels and posts that had been re-shared weren’t showing up in people’s Instagram stories, leading to significantly reduced reach – and this was not limited to posts about Israel and Gaza.
Meta also said there was a global outage of its live video service on Facebook for a short time.