A new paper argues that political orientation can be correctly classified with 72% accuracy using facial recognition technology. The paper opens with considerations about how “facial recognition can be used without subjects’ consent or knowledge”, which is true, but I am confident we do not need to worry about facial recognition technology being able to predict people’s political orientation. At least not based upon the methodology and findings presented in the paper in question.
Specifically, the 72% accuracy with which the classifier in the study was able to “predict” political orientation is not the same as the probability of correctly predicting the political orientation of a specific person when presented with a picture (or even multiple pictures) of that person.
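To make the distinction concrete, here is a minimal sketch with made-up numbers (the 72% sensitivity and specificity and the 30% base rate are my own hypothetical assumptions, not figures from the paper). Even if a classifier is “72% accurate”, the probability that its label for a specific person is correct depends on how common the two groups are in the population it is applied to:

```python
# Minimal sketch with hypothetical numbers: a "72% accurate" classifier does
# not imply a 72% chance that its label for a given person is correct.
# Assume, purely for illustration, that sensitivity = specificity = 0.72 and
# that 30% of the deployment population is conservative.

sensitivity = 0.72   # P(labelled conservative | actually conservative)
specificity = 0.72   # P(labelled liberal | actually liberal)
prevalence = 0.30    # hypothetical share of conservatives in the population

# Probability of receiving the label "conservative" at all
p_label_conservative = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Probability that a person labelled "conservative" actually is conservative
ppv = sensitivity * prevalence / p_label_conservative

print(f"P(label = conservative) = {p_label_conservative:.2f}")   # ~0.41
print(f"P(correct | label = conservative) = {ppv:.2f}")          # ~0.52
```

In this (hypothetical) scenario, a “conservative” label is only right about half the time, even though the classifier is nominally 72% accurate.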
The paper acknowledges some limitations of the approach, but only to conclude that “the accuracy of the face-based algorithm could have been higher” (i.e., what we are looking at are lower-bound estimates!). This is something you often see in scientific papers (limitations presented as humblebragging), because acknowledging the more serious limitations would decrease the likelihood of the paper being published (even in an outlet such as Scientific Reports).
To understand why the accuracy in “real life” is unlikely to be as high as 72%, we need to keep in mind that we are not looking at a random sample of pictures, and this significantly biases the task at hand towards more accurate predictions. First, not all people are liberals or conservatives. If we had to add a third category (such as “Centrist”), the accuracy would decrease. In other words, the classification task does not reflect the real challenge we would face if we were to use facial recognition technology to predict political affiliations.
Second, not everybody wants to declare their political orientation, and only people who did so are included in the study. The study relies on data from Facebook and dating websites. You will most likely have less of an issue with people being able to predict your political orientation if you are, in the first place, happy to provide information about your political orientation publicly. Accordingly, even if the estimate provided in the paper is realistic, I would definitely see it as an upper-bound estimate.
For the dating websites, more than half of the sample selected “Green”, “Libertarian”, “Other”, “Centrist” or “don’t know”. By only including the people who explicitly selected a liberal or conservative political orientation (i.e., less than half of the sample), we make the task a lot easier. The problem, or rather the good thing, is that people in real life do not fit into only these two categories. None of these studies on facial recognition technology deal with these issues, because doing so would make them a lot less important.
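To see how much this filtering can matter, here is a back-of-envelope sketch covering both points above (the 45% share of included users is my own hypothetical stand-in for “less than half of the sample”, not a figure from the paper). A two-class model can never correctly label the users who picked one of the other options, so its accuracy over the full dating-site sample would be far below the reported 72%:

```python
# Back-of-envelope with hypothetical shares: if more than half of the
# dating-site users picked "Green", "Libertarian", "Other", "Centrist" or
# "don't know", a liberal/conservative classifier cannot label them correctly,
# so accuracy over the full sample is much lower than on the filtered subsample.

acc_on_included = 0.72   # reported accuracy on self-declared liberals/conservatives
share_included = 0.45    # hypothetical stand-in for "less than half of the sample"
share_excluded = 1 - share_included

# Excluded users cannot be classified correctly by a two-class model
overall_accuracy = share_included * acc_on_included + share_excluded * 0.0

print(f"Accuracy over the full dating-site sample: {overall_accuracy:.2f}")  # ~0.32
```

Under these (made-up) numbers, the effective accuracy over the full sample drops to roughly a third rather than 72%.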
The Facebook data are even more interesting. The study describes how a face is a better predictor of political orientation than a 100-item personality questionnaire. Here is the twist: to measure political orientation, two items from that very questionnaire were used. Accordingly, the comparison is effectively with a 98-item personality questionnaire. With this information in mind, take a look at the following interpretation provided in the paper:
a single facial image reveals more about a person’s political orientation than their responses to a fairly long personality questionnaire, including many items ostensibly related to political orientation (e.g., “I treat all people equally” or “I believe that too much tax money goes to support artists”).
So an image of a person predicts the answers to two items in the 100-item International Personality Item Pool better than the other 98 items do? I don’t see this as convincing evidence. It is not a feature – it is a bug. Again, some participants were also excluded from this sample (although it is not easy to get a sense of how many were actually excluded).
The study concludes that given “the widespread use of facial recognition, our findings have critical implications for the protection of privacy and civil liberties.” It is great that people care about privacy and civil liberties (we should all care more about such topics!), but there is nothing in the study that makes me concerned about the ability of facial recognition technology to successfully predict political orientation.