Chinese Students Invent Coat That Makes People Invisible to AI Security Cameras

“AI-powered security cameras are everywhere, and they expose our privacy,” a researcher told VICE World News.
To the unaided eye, it looks identical to any other camouflage-patterned coat. To AI-powered surveillance cameras, however, it functions as an invisibility cloak, effectively hiding the wearer.

During the day, the coat’s camouflage pattern, generated by an algorithm, defeats cameras that rely on visible light. At night, when security cameras typically switch to infrared thermal imaging to identify people, heating elements built into the coat emit at varying temperatures, producing an unusual heat pattern that lets the wearer evade detection.

The InvisDefense coat, created by a team of four graduate students from Wuhan University in China, took first place in the inaugural “Huawei Cup,” a cybersecurity innovation competition sponsored by Chinese tech giant Huawei.

Wei Hui, a computer science PhD student who created the coat’s core algorithm, told VICE World News that the team “spent a lot of work” on it. According to him, the InvisDefense coat offers a “unique” way to evade the AI human-detection technology used in today’s security cameras.
When the students tested the coat against campus security cameras, it reduced the accuracy of pedestrian detection by 57 percent. One of the main challenges in developing the coat, they said, was balancing how well it fools the camera against how unremarkable it looks to the human eye.

“We had to use an algorithm to design a minimally conspicuous image that could disable camera vision,” Wei said.
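The team has not published its algorithm, but it resembles the adversarial-patch approach from the research literature: a printed pattern is optimized so that a person detector’s confidence drops, while a second term keeps the pattern visually unremarkable. Below is a minimal, hypothetical sketch in PyTorch; the detector choice, image file, patch placement, and loss weights are all assumptions for illustration, not the InvisDefense code.

```python
# Hypothetical sketch of adversarial-patch optimization against a person detector.
# Assumptions: a stock torchvision Faster R-CNN stands in for the camera's detector,
# "pedestrian.jpg" is a photo of the wearer, and the patch is pasted onto a fixed
# torso crop. None of this is the InvisDefense team's actual pipeline.
import torch
import torchvision

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()
for p in detector.parameters():
    p.requires_grad_(False)  # only the patch is being optimized

person = torchvision.io.read_image("pedestrian.jpg").float() / 255.0  # (3, H, W)
patch = torch.rand(3, 100, 100, requires_grad=True)  # pattern printed on the coat
optimizer = torch.optim.Adam([patch], lr=0.01)

for step in range(200):
    # Paste the current patch onto the torso region (a real pipeline would warp it
    # to follow the wearer's pose and render it under many viewpoints).
    img = person.clone()
    img[:, 150:250, 100:200] = patch.clamp(0, 1)

    outputs = detector([img])[0]
    person_scores = outputs["scores"][outputs["labels"] == 1]  # COCO class 1 = person

    # Objective: suppress detection confidence while penalizing high-frequency
    # noise so the pattern stays tame enough to pass as ordinary camouflage.
    detect_loss = person_scores.sum()
    smooth_loss = (patch[:, :, 1:] - patch[:, :, :-1]).abs().mean()
    loss = detect_loss + 0.5 * smooth_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In published patch attacks, the key to working outside the lab is rendering the patch onto many poses, scales, and lighting conditions at every optimization step; a single fixed crop like the one above would only fool the detector for that one photo.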

China runs a well-known, cutting-edge state surveillance apparatus that has been used to invade citizens’ privacy and target political opponents of the regime. In 2019, eight of the world’s ten most-surveilled cities were in China. Today, both the government and businesses use AI recognition tools for everything from flagging “suspect” Muslims in Xinjiang to stopping children from playing video games late at night.
Opposition has been limited. In 2020, in the country’s first case challenging the use of facial recognition technology, a law professor successfully sued a zoo in Hangzhou for collecting visitors’ facial biometric data without their consent.

The Wuhan University researchers had similar privacy concerns in mind while creating the InvisDefense coat, which will cost roughly 500 yuan ($71).

“AI-powered security cameras are everywhere. Our lives are affected by them,” said Wei. “Machine vision exposes our privacy.”

“We created this device to counter malicious detection and, in some cases, to protect people’s safety and privacy.”
According to Wei, the team’s future research plans include making inanimate objects and moving cars “invisible” to AI cameras, and finding ways to evade cameras that rely on remote sensing from satellites or aircraft.

The researchers, who are Chinese citizens, don’t appear to be trying to undermine the state’s extensive surveillance apparatus, though. In fact, the team says it wants to make it stronger.

Wei argued that a security camera that cannot pick up the InvisDefense coat is, by definition, flawed. Because the project essentially identifies weaknesses in existing machine vision technology, he said, the team is also working on it to push that technology forward.
