Understanding AI bias: where it comes from, how to detect it, and how to build fairer AI systems
- **Training data:** Internet data over-represents certain groups and perspectives. If 80% of "doctor" mentions co-occur with "he", the model learns that association.
- **Human annotation:** Annotators bring their own biases to what they consider appropriate or high-quality.
- **Optimization objectives:** Optimizing for overall accuracy can hide large disparities across demographic groups.
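The "doctor"/"he" point above can be checked directly by auditing co-occurrence statistics. A minimal sketch, using a hypothetical toy corpus (a real audit would stream a large text dataset):

```python
from collections import Counter

# Hypothetical toy corpus standing in for web-scale training text.
corpus = [
    "the doctor said he would review the chart",
    "the doctor said he was late for rounds",
    "the doctor noted she had finished rounds",
    "the nurse said she would check in",
    "the nurse said he was on duty",
]

def pronoun_counts(sentences, occupation):
    """Count gendered pronouns in sentences mentioning the occupation."""
    counts = Counter()
    for sentence in sentences:
        words = sentence.lower().split()
        if occupation in words:
            counts.update(w for w in words if w in ("he", "she"))
    return counts

counts = pronoun_counts(corpus, "doctor")
total = sum(counts.values())
for pronoun, n in counts.items():
    print(f"{pronoun}: {n / total:.0%}")
```

If one pronoun dominates the occupation's contexts, a model trained on that corpus will tend to reproduce the skew.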
| Approach | Stage | Example |
|---|---|---|
| Data debiasing | Dataset creation | Balance representation |
| Constitutional AI | Training | Add anti-bias principles |
| Post-processing | After deployment | Ensure demographic parity |
| Human review | Deployment | Flag high-stakes decisions |
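The post-processing row can be made concrete: demographic parity means each group receives positive outcomes at the same rate, which can be enforced after deployment by choosing a per-group score threshold. A minimal sketch with made-up scores and group labels:

```python
import numpy as np

def parity_thresholds(scores, groups, target_rate):
    """Pick a per-group threshold so each group's acceptance rate ~= target_rate."""
    thresholds = {}
    for g in set(groups):
        g_scores = [s for s, grp in zip(scores, groups) if grp == g]
        # Accepting everyone above the (1 - target_rate) quantile of a
        # group's own scores accepts roughly target_rate of that group.
        thresholds[g] = float(np.quantile(g_scores, 1 - target_rate))
    return thresholds

# Hypothetical model scores tagged by demographic group.
scores = [0.9, 0.8, 0.7, 0.4, 0.85, 0.6, 0.5, 0.3]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

th = parity_thresholds(scores, groups, target_rate=0.5)
accepted = {g: sum(s >= th[g] for s, grp in zip(scores, groups) if grp == g)
            for g in th}  # both groups end up with the same acceptance count
```

The trade-off is that equalizing rates ignores per-group score distributions; other criteria (equalized odds, calibration) make different choices, and they generally cannot all hold at once.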
Design a bias testing protocol for a resume screening AI. What would you test and how?
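One common ingredient of such a protocol is a counterfactual (name-swap) test: score the same resume under names associated with different demographic groups and flag large gaps. A minimal sketch, where `score_resume` is a hypothetical stand-in for the screener under test and the 0.05 tolerance is an illustrative choice:

```python
RESUME = "Software engineer, 5 years Python, led a team of 4."
NAME_VARIANTS = ["Emily Walsh", "Lakisha Washington", "Brad Baker", "Jamal Jones"]

def score_resume(text):
    # Placeholder: a real test would call the deployed screening model.
    return 0.72

def counterfactual_gap(resume, names, score_fn):
    """Max score difference across name variants of one identical resume."""
    scores = [score_fn(f"{name}\n{resume}") for name in names]
    return max(scores) - min(scores)

gap = counterfactual_gap(RESUME, NAME_VARIANTS, score_resume)
assert gap <= 0.05, f"name-swap gap {gap:.2f} exceeds tolerance"
```

A fuller protocol would also compare acceptance rates across real applicant groups, test proxies for protected attributes (zip code, school, gaps in employment), and repeat the tests after every model update.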