(GCR) — Artificial intelligence is quickly becoming an everyday part of our lives. Digital assistants have long helped us order products online and answer questions. Social media platforms use AI to serve up photos and videos tailored to our interests. Office workers have adopted chatbots to help write emails; teachers are on the lookout for students using them to write essays.
The potential uses for AI, beneficial and not, are seemingly boundless. But even technology's most avid visionaries have recognized the harms that may come with the opportunities. Citing "profound risks to society and humanity," Elon Musk, Steve Wozniak and Andrew Yang recently joined many others in an open letter calling for a six-month pause on AI experiments.
In the wrong hands, AI could spell trouble for persecuted Christians and religious minorities around the world. Oppressive governments, terrorist groups and other nefarious actors are already abusing and misusing digital technology for their evil ends. Adding the abilities of AI could worsen the lives of Christians and other vulnerable communities.
Global Christian Relief, the organization I lead, recently compiled a list of ways artificial intelligence could fuel global persecution.
Facial recognition software, abetted by AI technology, has made tracking and monitoring the movements of individual citizens and groups easier than ever. China, the world’s most proficient surveillance state, has installed 500 million cameras that can identify people out of crowds and log their movements. Recorded data can be stored, searched and easily paired with China’s social credit score system. China has now exported this technology to 60 countries.
AI-powered tools such as ChatGPT are easily susceptible to censorship by governments that want to target particular groups. Basic search engine results can also be manipulated. Asking a search engine "Where should I attend church?" may prompt responses along the lines of, "We don't recommend going to church" or "Attending church will negatively impact your social score."
Deepfakes, created with a form of artificial intelligence called deep learning, digitally replace one person's likeness with another's, allowing governments to fabricate events or speeches. Videos of pastors or faith leaders could be exploited by bad actors to make them appear to say something blasphemous, insulting or illegal, giving enemies a pretext for harassment, imprisonment or violence.
In the U.S., police departments have begun embracing predictive algorithms to anticipate where crimes are likely to occur. Hostile governments could easily abuse this technology to predict where Christians and religious minorities are likely to meet. Those with evil intent could lie in wait until the religious group gathers and then move in to arrest, attack and kill.
Perhaps most frightening are autonomous weapons. Persecuted Christians already face staggering levels of violence at the hands of jihadists and extremists. Lethal autonomous weapons, which use artificial intelligence to locate and destroy targets, pose an even greater threat to vulnerable groups if they fall into the wrong hands. Terrorists are already testing delivery drones to carry explosive devices. The addition of attacks from the air would be a catastrophic development for religious minorities.
These examples are just the tip of the iceberg. As artificial intelligence evolves ever faster, we need to slow down and allow more time to research its societal impacts, particularly around ethics and safety. We also need to pass legislation now to regulate how AI is developed and used. Otherwise, for the persecuted and vulnerable around the world, it will be too late.
Dr. David Curry is president and CEO of Global Christian Relief, America’s leading watchdog organization focused on the plight of persecuted Christians worldwide. In addition to equipping the Western church to advocate and pray for the persecuted, GCR works in the most restrictive countries to protect and encourage Christians threatened by faith-based discrimination and violence.