Slate – Our research shows that any one of us might end up helping the facial recognition industry, perhaps during moments of extraordinary vulnerability.

“If you thought IBM using ‘quietly scraped’ Flickr images to train facial recognition systems was bad, it gets worse. Our research, which will be reviewed for publication this summer, indicates that the U.S. government, researchers, and corporations have used images of immigrants, abused children, and dead people to test their facial recognition systems, all without consent. The very group the U.S. government has tasked with regulating the facial recognition industry is perhaps the worst offender when it comes to using images sourced without the knowledge of the people in the photographs.

The National Institute of Standards and Technology, a part of the U.S. Department of Commerce, maintains the Facial Recognition Verification Testing program, the gold standard test for facial recognition technology. This program helps software companies, researchers, and designers evaluate the accuracy of their facial recognition programs by running their software through a series of challenges against large groups of images (data sets) that contain faces from various angles and in various lighting conditions. NIST has multiple data sets, each with a name identifying its provenance, so that tests can include people of various ages and in different settings…”