Here’s a Way to Learn if Facial Recognition Systems Used Your Photos

The New York Times – “An online tool targets only a small slice of what’s out there, but may open some eyes to how widely artificial intelligence research fed on personal images. When tech companies created the facial recognition systems that are rapidly remaking government surveillance and chipping away at personal privacy, they may have received help from an unexpected source: your face. Companies, universities and government labs have used millions of images collected from a hodgepodge of online sources to develop the technology. Now, researchers have built an online tool, Exposing.AI, that lets people search many of these image collections for their old photos. The tool, which matches images from the Flickr online photo-sharing service, offers a window onto the vast amounts of data needed to build a wide variety of A.I. technologies, from facial recognition to online “chatbots.”
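The article does not describe Exposing.AI’s internals, but the basic idea of matching by Flickr metadata rather than by face can be sketched. The Python below is a minimal, hypothetical illustration, assuming a face dataset publishes the Flickr source URLs of its training images; the names DATASET_IMAGE_URLS, flickr_photo_id and was_my_photo_used are invented for this sketch and are not from the actual tool.

```python
import re

# Hypothetical manifest of source URLs a face dataset published for its
# training images (some real datasets did list Flickr URLs in this way).
DATASET_IMAGE_URLS = [
    "https://farm1.staticflickr.com/123/4567890123_abcdef1234.jpg",
    "https://farm8.staticflickr.com/456/9876543210_fedcba9876.jpg",
]

# Flickr static-image URLs embed the numeric photo ID before the secret
# hash, e.g. .../<photo_id>_<secret>(_<size>).jpg
FLICKR_ID = re.compile(r"/(\d+)_[0-9a-f]+(?:_[a-z])?\.jpg$")

def flickr_photo_id(url: str) -> str | None:
    """Extract the numeric Flickr photo ID from a static-image URL."""
    match = FLICKR_ID.search(url)
    return match.group(1) if match else None

# Index every Flickr photo ID found in the dataset's manifest.
dataset_ids = {pid for url in DATASET_IMAGE_URLS
               if (pid := flickr_photo_id(url)) is not None}

def was_my_photo_used(my_photo_url: str) -> bool:
    """Report whether the user's photo ID appears in the dataset index."""
    pid = flickr_photo_id(my_photo_url)
    return pid is not None and pid in dataset_ids

print(was_my_photo_used(
    "https://farm1.staticflickr.com/123/4567890123_abcdef1234.jpg"))  # True
```

The point of the sketch is that no facial recognition is needed to answer the question: a simple membership test over a dataset’s published metadata suffices, which is why such a tool can search many image collections quickly.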

“People need to realize that some of their most intimate moments have been weaponized,” said one of its creators, Liz O’Sullivan, the technology director at the Surveillance Technology Oversight Project, a privacy and civil rights group. She helped create Exposing.AI with Adam Harvey, a researcher and artist in Berlin. Systems using artificial intelligence don’t magically become smart. They learn by pinpointing patterns in data generated by humans — photos, voice recordings, books, Wikipedia articles and all sorts of other material. The technology is getting better all the time, but it can learn human biases against women and minorities. People may not know they are contributing to A.I. education. For some, this is a curiosity. For others, it is enormously creepy. And it can be against the law. A 2008 law in Illinois, the Biometric Information Privacy Act, imposes financial penalties if the face scans of residents are used without their consent…”