Question: Facial analysis system errors, II

A study conducted by researchers at MIT and the University of Toronto found that Amazon's facial analysis software, Rekognition, used by some local governments for tracking down suspects, routinely mislabeled dark-skinned men as women 1% of the time, and almost never mislabeled light-skinned men (0%). Suppose that the software processes a light-skinned man and a dark-skinned man.
(a) What is the probability that at least one is mislabeled as a woman?
(b) What is the probability that both are mislabeled?
(c) Given that at least one is mislabeled as a woman, what is the probability that the dark-skinned man is mislabeled?
Step-by-Step Solution
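Below is a minimal sketch of the standard calculation in Python, assuming the two classifications are independent and taking the quoted rates at face value: 0.01 for the dark-skinned man and exactly 0 for the light-skinned man. Because the light-skinned error rate is 0, any mislabel must come from the dark-skinned man, which is what drives the answer to part (c).

```python
# Error rates taken directly from the problem statement.
p_dark = 0.01   # P(dark-skinned man mislabeled as a woman)
p_light = 0.0   # P(light-skinned man mislabeled as a woman)

# (a) P(at least one mislabeled) = 1 - P(neither mislabeled),
#     assuming the two classifications are independent.
p_at_least_one = 1 - (1 - p_dark) * (1 - p_light)
print(f"(a) P(at least one) = {p_at_least_one:.4f}")   # 0.0100

# (b) P(both mislabeled) = P(dark) * P(light) under independence.
p_both = p_dark * p_light
print(f"(b) P(both) = {p_both:.4f}")                   # 0.0000

# (c) P(dark mislabeled | at least one mislabeled)
#     = P(dark mislabeled) / P(at least one), since the dark-skinned
#     man being mislabeled already implies "at least one mislabeled".
p_dark_given = p_dark / p_at_least_one
print(f"(c) P(dark | at least one) = {p_dark_given:.4f}")  # 1.0000
```

Note that the conditional probability in (c) comes out to 1: since the light-skinned man is never mislabeled under these rates, observing at least one mislabel means it must be the dark-skinned man.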
