• TAG@lemmy.world · 6 days ago

    That reminds me of the time, quite a few years ago, when Amazon tried to automate resume screening. They trained a machine learning model on anonymized resumes and whether each candidate was hired, then examined what the model was actually paying attention to. It had taught itself how to reject women.
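
    The failure mode is easy to reproduce in miniature. Here's a rough sketch (hypothetical data and feature names, scikit-learn, not Amazon's actual pipeline): fit a classifier on past hiring outcomes, then read off which tokens the model uses to reject candidates.

```python
# Minimal sketch of "train on past hiring outcomes, then inspect what
# the model learned." All data and feature names are made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "captain of men's rugby team, software engineer",
    "women's chess club president, software engineer",
    "men's lacrosse, data analyst",
    "women's coding society, data analyst",
]
hired = [1, 0, 1, 0]  # historical (biased) outcomes used as labels

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# "Look at what the AI is looking at": the most negative coefficients
# are the tokens the model leans on to reject candidates.
weights = sorted(zip(model.coef_[0], vec.get_feature_names_out()))
for w, token in weights[:3]:
    print(f"{token}: {w:+.2f}")  # gendered tokens surface as rejection signals
```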

    • merc@sh.itjust.works · 5 days ago

      Another similar “shortcut” I’ve heard about: a system that analyzed job performance concluded that the two strongest predictors of success were being named “Jared” and having played lacrosse in high school.

      And, these are the easy-to-figure-out ones we know about.

      If the bias is more complicated, it might never be spotted.