AI systems are trained on data, and if that data reflects human biases, the AI will repeat or amplify them.

  • Resume screening AIs have shown bias against women

  • Predictive policing tools have disproportionately flagged minority communities

  • Image generators may reproduce cultural stereotypes

Ensuring fairness means questioning how these systems make decisions: auditing the data they are trained on and measuring their outcomes across demographic groups.
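One common way to measure outcomes across groups is a demographic parity check: compare the rate of favorable decisions each group receives. The sketch below is a minimal illustration on entirely hypothetical toy data (the group labels, decisions, and function name are assumptions, not from any real system).

```python
# Minimal sketch of a demographic parity audit on hypothetical data.

def demographic_parity_gap(decisions, groups):
    """Return the difference in positive-decision rates between groups.

    decisions: list of 0/1 outcomes (1 = favorable decision)
    groups:    list of group labels, parallel to decisions
    """
    rates = {}
    for g in set(groups):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(outcomes) / len(outcomes)
    vals = sorted(rates.values())
    return vals[-1] - vals[0]

# Hypothetical screening outcomes: 1 = advanced to interview, 0 = rejected.
decisions = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

gap = demographic_parity_gap(decisions, groups)
print(f"Demographic parity gap: {gap:.2f}")  # group A: 0.80, group B: 0.20
```

A gap near zero suggests both groups receive favorable decisions at similar rates; a large gap, as in this toy example (0.60), is a signal that the decision process deserves closer scrutiny. Demographic parity is only one of several fairness criteria, and which one is appropriate depends on the context.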