After a study found that artificial intelligence could successfully predict a person’s political orientation from images of neutral faces, researchers warn that facial recognition technology is a bigger threat “than previously thought” and poses a “serious challenge to privacy.”
Research recently published in American Psychologist says that an algorithm’s ability to accurately infer a person’s political views is “on par with the accuracy with which job interviews predict job success, or alcohol drives aggressiveness.”
Lead author Michal Kosinski told Fox News Digital that 591 participants completed a questionnaire about their political orientation before the AI captured what he called a numerical “fingerprint” of their faces and compared it against a database of responses to predict their views.
“I don’t think people realize how exposed they are just by putting a photo out there,” said Kosinski, an associate professor of organizational behavior at Stanford University’s Graduate School of Business.
“We know that people’s sexual orientation, political orientation, and religious views should be protected. It used to be different. In the past, when you looked at someone’s Facebook account, you could see, for example, their political views, their likes, the pages they follow, and more. But years ago it became clear to policymakers, Facebook, and journalists that this was unacceptable, and Facebook shut it down. It’s too dangerous,” he continued.
“But if you go on Facebook, anyone can see your photos. A stranger who has never met you, whom you never allowed to see your photos, and with whom you never shared your political orientation. And what our research shows is that those photos essentially reveal, in some ways, your political orientation and your sexuality,” Kosinski added.
In this study, images of participants were collected in a highly controlled manner, the authors said.
“Participants wore black T-shirts adjusted with binder clips to cover their clothing. They removed all jewelry and, if necessary, shaved facial hair. Makeup was removed using fresh face wipes until no residue was detected on the skin. Hair was pulled back using hair ties, bobby pins, and headbands, taking care to avoid flyaway hairs,” they wrote.
The facial recognition algorithm VGGFace2 then examined the images and extracted “facial descriptors, or numerical vectors, that are unique to an individual and consistent across different images,” the study said.
“The descriptors extracted from a given image are compared with the descriptors stored in the database. If they are similar enough, the faces are considered a match. Here, we use linear regression to map facial descriptors onto a political orientation scale and then use this mapping to predict political orientation for previously unseen faces,” the study states.
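The two-step pipeline the study describes — compare descriptor vectors for face matching, then fit a linear regression from descriptors to an orientation scale — can be sketched as below. This is an illustrative reconstruction, not the authors’ code: the descriptor vectors are random stand-ins for real VGGFace2 embeddings, the dimensionality and the continuous orientation scores are assumptions, and the similarity threshold is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for VGGFace2 face descriptors: one 128-dim vector per face.
# (Real descriptors come from a deep network; these are synthetic.)
n_faces, dim = 500, 128
descriptors = rng.normal(size=(n_faces, dim))

# Step 1: face matching via descriptor similarity.
def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A second photo of the same person yields a slightly perturbed descriptor;
# if similarity exceeds a threshold, the faces are "considered a match."
second_photo = descriptors[0] + rng.normal(scale=0.05, size=dim)
is_match = cosine_similarity(descriptors[0], second_photo) > 0.9

# Step 2: linear regression mapping descriptors onto a political
# orientation scale (synthetic continuous scores for illustration).
true_weights = rng.normal(size=dim)
orientation = descriptors @ true_weights + rng.normal(scale=0.1, size=n_faces)

# Ordinary least squares fit of descriptor -> orientation score.
weights, *_ = np.linalg.lstsq(descriptors, orientation, rcond=None)

# Predict the orientation of a never-before-seen face descriptor.
unseen = rng.normal(size=dim)
predicted_score = float(unseen @ weights)
```

Because the mapping is just a learned weight vector, applying it to a new face is a single dot product, which is consistent with Kosinski’s point that such models are cheap to run at scale.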
The authors said the findings “underscore the urgency for scholars, the public, and policymakers to recognize and address the potential risks of facial recognition technology to individual privacy,” adding that an analysis of facial features associated with political orientation revealed that “conservatives tended to have larger lower faces.”
“Perhaps most importantly, our findings suggest that widespread biometric surveillance technology is more of a threat than previously thought,” the study warns. “Previous research has shown that naturalistic facial images convey information about political orientation and other intimate traits. But it was unclear whether that prediction was enabled by self-presentation, by stable facial features, or by both. Our results suggest that stable facial features convey a substantial amount of signal, implying that individuals have much less control over their privacy than they perceive.”
Kosinski told Fox News Digital that such “algorithms can be applied very easily, quickly and inexpensively to millions of people,” adding that the research is “more of a warning tale” about a technology that is “very widely used in mobile phones and everywhere.”
The authors write that “even crude estimates of people’s character traits can significantly improve the efficiency of online mass persuasion campaigns,” and that “scholars, the public, and policymakers should take notice and consider tightening policies regulating the recording and processing of facial images.”