Question: Facial analysis system errors. A study conducted by researchers at MIT and the University of Toronto found that Amazon's facial analysis software Rekognition, used by some local governments for tracking down suspects, mislabeled dark-skinned women as men 31% of the time and mislabeled light-skinned women as men 7% of the time. Suppose that the software processes a light-skinned woman and a dark-skinned woman.

(https://www.cnet.com/news/amazons-facial-tech-shows-gender-racial-bias-mit-study-says/)

(a) What is the probability that at least one is mislabeled as a man?

(b) What is the probability that both are mislabeled?

(c) Given that at least one is mislabeled as a man, what is the probability that the dark-skinned woman is mislabeled?

Step-by-Step Solution

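A minimal worked sketch in Python. It treats the two classifications as independent events; the problem implies this by processing two separate women, but independence is an assumption rather than something the problem states outright:

```python
# Minimal sketch: computes (a)-(c) assuming the two classification
# errors are independent events (an assumption, not stated in the problem).

p_dark = 0.31   # P(dark-skinned woman is mislabeled as a man)
p_light = 0.07  # P(light-skinned woman is mislabeled as a man)

# (a) P(at least one mislabeled) = 1 - P(neither mislabeled)
p_at_least_one = 1 - (1 - p_dark) * (1 - p_light)

# (b) P(both mislabeled) = P(dark mislabeled) * P(light mislabeled)
p_both = p_dark * p_light

# (c) P(dark mislabeled | at least one mislabeled)
#     = P(dark mislabeled and at least one) / P(at least one)
#     = P(dark mislabeled) / P(at least one),
#     since "dark mislabeled" already implies "at least one mislabeled"
p_dark_given_at_least_one = p_dark / p_at_least_one

print(f"(a) {p_at_least_one:.4f}")            # 0.3583
print(f"(b) {p_both:.4f}")                    # 0.0217
print(f"(c) {p_dark_given_at_least_one:.4f}") # 0.8652
```

The key step in (c) is that the event "the dark-skinned woman is mislabeled" is contained in the event "at least one is mislabeled," so the intersection in the conditional-probability formula reduces to P(dark mislabeled) divided by the answer from (a).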