Terrorist groups like the Islamic State (IS) and Al-Qaeda are using technology, especially artificial intelligence, to spread extremist messages, evade censorship and, above all, recruit members.
In Family Guy, one of the world’s most popular animated comedy shows, there is a scene in which the main character, Peter Griffin, drives a truck loaded with bombs across a bridge. In a clip circulated online, IS supporters used artificial intelligence (AI) to replace Griffin’s words with the sentence: “Our weapons are great, we have many ranks, the soldiers of Allah are more ready,” in order to attract more followers to IS.
“The rapid spread of AI technology in recent years has had a profound impact on the way extremist organizations exert influence online,” said Daniel Siegel, an American researcher. In February 2024, an Al-Qaeda-affiliated group announced it was starting to host online AI workshops, and later released a guide to using AI chatbots. In March 2024, after an IS affiliate carried out a terrorist attack on a theater in Moscow, one of the group’s followers created a fake news story about the attack and posted it online. Most recently, in early July, Spanish Interior Ministry officials arrested nine young people for sharing content glorifying IS, including one described as focusing on extremist multimedia content produced with specialized AI-powered editing applications.
“AI is being used as a supplement to official propaganda by both Al-Qaeda and IS,” said Moustafa Ayad, executive director for Africa, the Middle East and Asia at the London-based Institute for Strategic Dialogue. This comes as no surprise to long-time IS watchers. When the group first emerged around 2014, it produced videos intended to intimidate and recruit members. Watchdogs have documented various ways in which extremist groups are using AI. Beyond propaganda, IS may also use chatbots such as ChatGPT to converse with potential recruits.
While AI models such as ChatGPT include safeguards intended to prevent their use for crimes like terrorism and murder, these safeguards have proven unreliable and can be bypassed by terrorists. There are also concerns that extremists could use AI tools to carry out cyberattacks or to help plan real-world terrorist attacks.
The bigger threat now, analysts say, is that these groups will actually carry out attacks, inspire “lone wolf” actors, or successfully recruit new members. With the international community outraged by Israel’s war in Gaza, extremist groups are using civilian deaths as a rhetorical tool to recruit and build campaigns.
KHANH MINH
Source: https://www.sggp.org.vn/nguy-co-khi-cac-nhom-khung-bo-su-dung-cong-nghe-cao-post749867.html