A big biometric security company in the UK, Facewatch, is in hot water after its facial recognition system caused a major snafu - the system wrongly identified a 19-year-old woman as a shoplifter.

  • Hacksaw@lemmy.ca · 10 months ago

    Stores in most developed countries, the UK included, can refuse service only for legitimate reasons, and they have to do so uniformly, based on fair and unbiased rules. If they don’t, they’re at risk of an unlawful discrimination suit.

    https://www.milnerslaw.co.uk/can-i-choose-my-customers-the-right-to-refuse-service-in-uk-law

    She didn’t do anything that would count as a “legitimate reason”, and even if the policy is applied uniformly, it’s difficult to prove that an AI model doesn’t discriminate against protected groups - especially with so many studies showing the opposite.

    I think she has as much standing as anyone to sue for discrimination. There was no legitimate reason to refuse service, and AI models famously discriminate against women and minorities, especially when it comes to “lower class” criminal behavior like shoplifting.

    • SSTF@lemmy.world · 10 months ago

      I am waiting to follow the case for updates, because while I hope the outcome pushes back on AI systems like this, I am skeptical that current laws treat what is happening as protected-class discrimination. I presume that in the UK the burden of proving fault in the AI lies with the plaintiff, which is at the heart of whether the reason is legitimate in the eyes of the law.