Thanks to @jovialjoy, I'm reading through this #FairFace paper. There's a lot to unpack, but I just want to point out a couple of underlying fundamental problems with the way these authors think about gender. https://openaccess.thecvf.com/content/WACV2021/papers/Karkkainen_FairFace_Face_Attribute_Dataset_for_Balanced_Race_Gender_and_Age_WACV_2021_paper.pdf
The authors developed a new face image dataset and trained a model that they claim performs better than previous models on 'race, gender, and age classification.' However, even putting aside (for now) ethical problems, they made a number of fundamentally incorrect assumptions:
False assumption 1: there are only two genders.
The authors don't even provide a caveat, such as 'for the purposes of this study we reduced gender to a binary.'
I'm not going to waste time here with this one other than to say, look it up. https://en.wikipedia.org/wiki/Gender
False assumption 2: humans can correctly classify gender from photos (to provide 'ground truth' for model testing). They took a 'best 2 out of 3 Turker guesses' approach.
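To make that procedure concrete: with a forced binary label and three raters, a strict majority always exists, so the pipeline can't even register disagreement or uncertainty. A minimal sketch of that majority vote (my reconstruction, not the authors' code; the function name and labels are made up):

```python
from collections import Counter

def majority_label(annotations):
    """Return the label chosen by most annotators.

    With a forced binary choice and three raters, a strict majority
    always exists -- so this scheme can never output "unsure" or
    "the raters disagreed."
    """
    label, _count = Counter(annotations).most_common(1)[0]
    return label

# Hypothetical Turker annotations for one face image:
print(majority_label(["female", "female", "male"]))  # -> "female"
```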
False assumption 3: humans can correctly classify gender from photos ACROSS ALL RACES, ETHNICITIES, AGES, & GENDERS with equal error rates.
This is especially problematic because it undermines the whole point of the paper: if annotators misclassify some groups more often than others, the 'balanced' labels the fairness claims rest on are themselves skewed. A sketch of what checking this would require follows below.
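Actually testing that assumption would mean comparing the Turker-derived labels against people's self-identified gender, group by group. A hedged sketch of what that check looks like (the data and column names here are invented for illustration):

```python
import pandas as pd

# Invented example data: one row per image, with the Turker-derived
# label, the person's self-identified gender (which you can only get
# by asking -- see assumption 4 below), and their race/ethnicity group.
df = pd.DataFrame({
    "group":           ["A", "A", "B", "B", "B"],
    "turker_label":    ["male", "female", "male", "male", "female"],
    "self_identified": ["male", "female", "female", "male", "female"],
})

# Annotation error rate per group. If these differ, the 'ground truth'
# is itself biased, and any per-group accuracy the model reports is
# measured against a skewed yardstick.
errors = df["turker_label"] != df["self_identified"]
print(errors.groupby(df["group"]).mean())  # e.g. A: 0.00, B: 0.33
```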
False assumption 4: it's OK for either humans or machines to classify people's gender without consent.
It's not OK to do this. Don't do it. You need to ask people if you want to know their gender, and get informed consent if you want to use that information to train your model.
False assumption 5: skin = race. It does not. Computer Scientists: Get It Through Your Head. IT DOESN'T. No phenotypical characteristic is a 1-1 match with race because race is not biology. Look it up. https://en.wikipedia.org/wiki/Race_(human_categorization)
False assumption 6: "Latino is often treated as an ethnicity, but we consider Latino a race, which can be judged from the facial appearance."
My friends, Latino is not a 'race,' ESPECIALLY in this context (computer vision).
Here: https://www.vox.com/2016/8/28/12658908/latino-hispanic-race-ethnicity-explained
If you'd like to learn more about the construction of Latino identity & representation in the U.S., I highly recommend the brilliant @arlenedavila1's classic text 'Latinos, Inc.' https://www.ucpress.edu/book/9780520274693/latinos-inc