Facewatch, a big biometric security company in the UK, is in hot water after a major snafu: its facial recognition system wrongly identified a 19-year-old woman as a shoplifter.

  • CeeBee@lemmy.world · 1 month ago

    No, they aren’t. This is the narrative that keeps getting repeated over and over, and the citation for it is usually the ACLU’s test of Amazon’s Rekognition system, which was deliberately flawed to produce this exact outcome (and people are still repeating the same claim years later).

    The top FR systems have no issues with any skin tones or complexions.

      • CeeBee@lemmy.world · edited · 1 month ago

        I promise you I’m well aware of all the studies, technologies, and companies involved; I worked in the industry for many years.

        The technical studies you’re referring to show that the difference in error rate between a white man and a black woman (usually the polar opposites in terms of results) is around 0.000001%. But this usually gets blown out of proportion by media outlets.

        If you have white men at 0.000001% error rate and black women at 0.000002% error rate, then what gets reported is “facial recognition for black women is 2 times worse than for white men”.

        It’s technically true, but in practice it’s a misleading and disingenuous statement.
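        To make the framing concrete, here’s a quick Python sketch of the arithmetic, using the hypothetical rates from the example above (illustrative numbers, not figures from the NIST report):

        ```python
        # Toy illustration: the same two error rates framed relatively
        # vs. absolutely. The rates are the hypothetical figures from
        # this comment, not values from the NIST report.

        white_men   = 0.000001 / 100   # 0.000001% expressed as a fraction
        black_women = 0.000002 / 100   # 0.000002% expressed as a fraction

        relative_ratio = black_women / white_men   # 2.0 -> "2 times worse"
        absolute_gap   = black_women - white_men   # ~1e-8

        # Extra misidentifications expected per 100 million comparisons
        extra_per_100m = absolute_gap * 100_000_000

        print(f"relative: {relative_ratio:.1f}x")                          # 2.0x
        print(f"absolute gap: {absolute_gap:.0e}")                         # 1e-08
        print(f"extra errors per 100M comparisons: {extra_per_100m:.0f}")  # 1
        ```

        Both framings are arithmetically correct; the relative one just sounds dramatic, while the absolute one works out to a single extra error per hundred million comparisons.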

        Edit: here’s the actual technical report if anyone is interested

        https://pages.nist.gov/frvt/reports/1N/frvt_1N_report.pdf