The Current Situation: AI Surveillance of Uyghurs
AI identifies criminal suspects and potential criminals, and notifies the security authorities
Article from Jan. 25th, 2022, by Toshio Sasaki (author and journalist; member of the editorial board of the Ministry of Internal Affairs and Communications' White Paper on Information and Communications; member of the Broadcasting Program Council of FM Tokyo; member of the Information Network Law Association. After leaving the Department of Political Science at Waseda University's School of Political Science and Economics, he joined the Mainichi Newspapers in 1988. In 1999, he moved to an IT publishing company and shifted his reporting focus to the technology field.)
AI (Artificial Intelligence) extracts potential criminals and the police crack down on them before they commit crimes. In China, which is willing to invest heavily in AI, some say that the surveillance system has improved society. Is AI an enemy of humanity or a friend?
This is a truly remarkable book. Geoffrey Cain's "AI Prison Uyghur" (the Japanese edition of "The Perfect Police State") is the first book of its kind to bring to light how technology is used for political oppression.
The most striking example in this book is the "predictive policing" of Uyghurs, in which AI (artificial intelligence) and personal data are used to identify potential criminals and police them before they commit crimes. This may ring a bell with many of you. Yes, it is the 2002 movie "Minority Report," starring Tom Cruise. In a future U.S. where a murder-prediction system has been put to practical use, the police's Precrime unit detects murders before they happen, and the murder rate is kept at zero. This is exactly what is happening in Xinjiang today.
In the movie, the premise was that psychics could predict crimes, but the predictive policing in Xinjiang is done by artificial intelligence. The evolution of AI since the early 2010s has been tremendous: given enough data, it can instantly discover trends and characteristics that are completely invisible to the human eye.
China’s Shock at AlphaGo
The world was first shown how formidable the Go program AlphaGo was in 2016, when it shocked people by defeating one of the world's top Go players, Lee Sedol 9-dan of South Korea, 4-1. AlphaGo read a huge number of past human game records and extracted data on which moves, in which situations, bring a player closer to victory. Furthermore, the program played a large number of games against itself, and through this self-play acquired the ability to leave human players far behind.
It was not only the general public that was shocked by the emergence of AlphaGo. The Chinese Communist Party government was shocked as well. Go is a game created in ancient China, and a Western AI had conquered it. How great was the shock of this victory? In this book, Kai-Fu Lee, the Taiwan-born computer scientist based in China, describes it very aptly:
"China's Sputnik moment."
Sputnik was the first artificial satellite, launched by the Soviet Union in 1957. Its success alarmed the U.S. and other Western countries, which feared they would be surpassed by the Soviet Union in science and technology; the event came to be called the "Sputnik shock." In the same way, China was shocked by American AI and came to believe that it must catch up with and surpass the U.S. in the field of AI.
In fact, since 2016, China's investment in the AI field has been tremendous. According to the Chinese government, for example, three-quarters of global investment in AI research and development in 2018 came from Chinese companies. The world leaders in AI have long been the group of U.S. tech companies known in Japan as "GAFA": Google (whose parent company owns DeepMind, the developer of AlphaGo), Amazon, Facebook, and Apple. In recent years, however, the rise of four Chinese companies, Baidu, Alibaba, Tencent, and Huawei, has been remarkable. Their acronym is "BATH," and it has even come to be said that "BATH will be the next GAFA."
In the U.S., on the other hand, GAFA is facing a headwind. Criticism is mounting over the way these companies siphon off huge amounts of personal data from their users and use it for advertising and marketing, and over the fact that this data has even influenced presidential elections. The business model has been called "surveillance capitalism," because it is built on data gathered by monitoring individuals. Some in the tech industry fear that this headwind will make it harder to use personal data in the U.S. going forward, and that this will in turn slow the evolution of AI.
"Chinese society is getting better" through AI surveillance
In China, however, no such concerns exist. On the contrary, the Chinese Communist Party is pushing ahead with AI-driven collection of personal data on a nationwide scale. And yet (a point this book unfortunately does not explain) this direction in China should not be rejected out of hand.
In the 20th century, the Great Leap Forward and the Cultural Revolution of Mao Zedong's era destroyed the social capital of trust between people, and it is said that a tendency to chase short-term profit, rather than build long-term human relationships, became the norm in China. With the introduction of AI-based surveillance, however, trust is being restored.
China is implementing a "social credit system" in which each citizen is assigned a credit score based on surveillance data. The higher your score, the more benefits you receive. If your score falls, on the other hand, you face punishments such as travel bans or a lowering of social status. The idea is that this encourages people to behave in a more disciplined manner and helps restore trust among individuals.
A Chinese acquaintance of mine, who works for a major Chinese tech company, once told me: "It doesn't bother me, because it isn't humans doing the watching; it's AI. More than that, I feel that Chinese society is clearly getting better. The West criticizes it, but what exactly is the problem with this system?"
That is certainly a real benefit.
But at the same time, this social credit system is being misused to oppress Uyghurs, and the surveillance directed at them is far more thorough. The book's account of this surveillance is so detailed that I can only call it a masterpiece of reporting.
Monitoring even the purchase of kitchen knives
For the "predictive policing" described above, facial recognition cameras and infrared cameras with night vision are installed everywhere, including entertainment facilities, supermarkets, schools, and the homes of religious leaders. The system also collects car license plate numbers and citizen ID numbers from community visitor-management systems and local checkpoints, and can even harvest, via local wireless LANs, the unique hardware addresses of every computer and smartphone in the area. From this vast pool of data, the AI picks out criminal suspects or potential criminals and pushes notifications to security officials, who can then make door-to-door visits, restrict movement, and take other measures.
Uyghur households that "do not meet the trustworthiness criteria" are ordered to install surveillance cameras inside their own homes. They are forced to buy the cameras at electronics stores with their own money and to install them under plastic covers so that they cannot be tampered with or switched off.
What is even more surprising is that, to prevent knife attacks, the authorities even control the knives sold in daily-necessities stores. Even small shops have been required to install engraving machines, costing hundreds of thousands of yen, that etch a QR code onto the blade of each kitchen knife; scanning the code brings up the photo, name, address, ethnicity, and other data of the person who bought it. This is so thorough that it has already surpassed the premise of the movie "Minority Report."
Blades can be used to kill people, but they can also be used to prepare great food. While a social credit system built on AI and data surveillance is restoring trust in Chinese society, the same technology is being used for horrific repression.
There is no such thing as "good technology" or "bad technology." Technology has no consciousness or ethics; what matters is how we use it. The reality of Xinjiang depicted in this book brings this duality into sharp, vivid focus.