Technology News

OpenAI Shuts Down Iranian Group’s ChatGPT Accounts Over US Election Interference


OpenAI announced on Friday that it had terminated the accounts of an Iranian group that had been using its ChatGPT chatbot to generate content intended to influence opinion on the U.S. presidential election and other sensitive issues.

The group, identified as Storm-2035, used ChatGPT to create content on topics such as U.S. election candidates, the conflict in Gaza, and Israel's participation in the Olympic Games, and then disseminated that content via social media accounts and websites.

An investigation by the Microsoft-backed AI company revealed that ChatGPT was used to produce both long-form articles and shorter social media comments.

However, OpenAI noted that the operation did not seem to have achieved significant engagement. Most of the identified social media posts received little to no interaction, and there was no evidence of the web articles being widely shared on social media.

The accounts involved have been banned from accessing OpenAI’s services, and the company continues to monitor for any further attempts to violate its policies.

In August, a Microsoft threat-intelligence report highlighted that the Iranian network Storm-2035, which operates through four websites posing as news outlets, was actively engaging U.S. voter groups with polarizing messages on issues such as U.S. presidential candidates, LGBTQ rights, and the Israel-Hamas conflict.

As the U.S. presidential election approaches on November 5, with Democratic candidate Kamala Harris and Republican rival Donald Trump in a close race, OpenAI disclosed that in May, it had disrupted five covert influence operations that sought to misuse its models for deceptive activities across the internet.


Thank you for reading. We hope this gives you a quick overview of the latest news. Interested in more articles on Machine Learning and Data Science? Explore our Technology blogs for more.
