Almost no room is without a camera in China’s detention camps
Technology companies are complicit in the growth of China’s surveillance machine.
China’s camera monitoring and facial recognition of Uyghurs has been growing in scale, yet most of the companies contributing to the rising power of state surveillance are escaping repercussions.
As a recent article in The Guardian explains, China’s monitoring of the religious activity of Uyghurs and other ethnic minorities has been steadily expanding over the last 20 years.
Last month, a non-governmental tribunal was organized in London by a group of lawyers, professors and advocacy groups, including the World Uyghur Congress, to bring attention to the treatment of Uyghurs in China.
According to the testimony of Abdusalam Muhammad, one of many survivors of Chinese detention and re-education camps, the surveillance apparatus focused on Uyghurs and other ethnic minorities has become more targeted as a direct consequence of technological advancements.
Pervasive camera surveillance and facial recognition are making detention and arrests inescapable. As Dolkun Isaa, president of the World Uyghur Congress advocacy group, put it, “the goal of the surveillance is not only to instill fear in Uyghurs, but also to single them out for detention in the internment camp system.”
Baqitali Nur, another former detainee, testified that the detention camp was covered in cameras. “The only camera-free place was where the toilet was.”
The technological capabilities of these cameras have been expanding too: beyond constant monitoring, facial recognition capacities have been growing. Their proliferation is a direct “outcome of state policy”, as Conor Healy of the security research group IPVM stated. According to The Guardian, China listed the ability to distinguish between Uyghurs and non-Uyghurs “as one of the requirements of facial recognition systems that the government implements”.
Eleven companies were named at the hearings of the World Uyghur Congress as potentially playing a role in China’s surveillance practices. Yet for most of them, little has changed from a business perspective, and some are even successfully evading proper scrutiny.
How does the continuous screening, capturing and digitisation of our faces, voices and emotions impact how we value them? What are the socio-political consequences of using algorithms that reduce your face to a digital barcode, and that make assumptions about your identity based on how you look? Is it time to reclaim our faces?
Photo credits: Pieter Kers | beeld.nu