Bias Audit Report: Employment Prediction in South Africa

Project Overview

This repository, hosted by the Masikhule organization, presents a bias audit of a synthetic employment-prediction dataset (predicting income > R50,000/year), with the analysis modelled on IBM AI Fairness 360. It examines algorithmic bias along gender (Female/Male) and race (Non-White/White) in the South African context, marked by a 33.2% unemployment rate [5], a Gini coefficient of 0.63 [0], and historical inequalities rooted in apartheid. Aligned with the Employment Equity Act and Broad-Based Black Economic Empowerment (B-BBEE) goals [6], the audit evaluates fairness metrics (Disparate Impact, Equal Opportunity Difference, Equalized Odds) and applies mitigations (preprocessing, reweighing) to promote equitable AI-driven hiring.
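For reference, the three audited metrics follow standard fairness definitions. The minimal sketch below is not the repository's code: the array names (y_true, y_pred, prot) are illustrative, with prot == 1 marking the privileged group.

```python
import numpy as np

def disparate_impact(y_pred, prot):
    # P(y_hat = 1 | unprivileged) / P(y_hat = 1 | privileged).
    # 1.0 is parity; values below ~0.8 commonly flag adverse impact
    # (the "four-fifths rule").
    return y_pred[prot == 0].mean() / y_pred[prot == 1].mean()

def equal_opportunity_difference(y_true, y_pred, prot):
    # TPR(unprivileged) - TPR(privileged); 0.0 is parity.
    tpr = lambda g: y_pred[(prot == g) & (y_true == 1)].mean()
    return tpr(0) - tpr(1)

def equalized_odds_difference(y_true, y_pred, prot):
    # Larger of the TPR and FPR gaps between groups; 0.0 is parity.
    rate = lambda g, label: y_pred[(prot == g) & (y_true == label)].mean()
    tpr_gap = abs(rate(0, 1) - rate(1, 1))
    fpr_gap = abs(rate(0, 0) - rate(1, 0))
    return max(tpr_gap, fpr_gap)
```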

Team Members

Objectives

Methodology

Using a synthetic dataset (80% non-White [0]), we trained a logistic regression model to predict the employment outcome (income > R50,000/year). Bias was assessed with the fairness metrics above and chi-squared tests of association, and mitigations were applied to reduce the observed disparities. Visualizations (e.g., accuracy vs. fairness trade-offs) were generated in Python (NumPy, Pandas, Scikit-learn, Matplotlib, SciPy, PyTorch). A sketch of this pipeline follows.
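The following end-to-end sketch illustrates the methodology under stated assumptions: the data generator, column names (race_white, gender_male, experience, high_income), and coefficients are invented for illustration, and the reweighing step follows the Kamiran & Calders formulation rather than the notebook's exact code.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "race_white": rng.binomial(1, 0.20, n),   # ~80% non-White, per the dataset
    "gender_male": rng.binomial(1, 0.50, n),
    "experience": rng.normal(5, 2, n),
})
# Synthetic label with a deliberate group-dependent skew to audit.
logits = 0.4 * df["experience"] + 0.8 * df["race_white"] - 2.0
df["high_income"] = rng.binomial(1, 1 / (1 + np.exp(-logits.to_numpy())))

X = df[["experience", "gender_male", "race_white"]]
y, prot = df["high_income"], df["race_white"]

# Chi-squared test of association between race and the outcome.
chi2, p, _, _ = chi2_contingency(pd.crosstab(prot, y))
print(f"chi2={chi2:.1f}, p={p:.3g}")

# Reweighing (Kamiran & Calders): weight each (group, label) cell by
# P(group) * P(label) / P(group, label) so group and label decouple.
pg, py = prot.mean(), y.mean()
pgy = pd.crosstab(prot, y, normalize=True)
w = np.array([(pg if g else 1 - pg) * (py if t else 1 - py) / pgy.loc[g, t]
              for g, t in zip(prot, y)])

model = LogisticRegression().fit(X, y, sample_weight=w)
y_pred = model.predict(X)
di = y_pred[prot == 0].mean() / y_pred[prot == 1].mean()
print(f"Disparate impact after reweighing: {di:.2f}")
```

With the weights applied during training, the disparate impact moves toward 1.0 relative to an unweighted fit; swapping in the unweighted model (omit sample_weight) reproduces the pre-mitigation baseline for comparison.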

Key Findings

Deliverables

Usage

  1. Run the Notebook:
    • Download Bias_Audit_Report_SA.ipynb.
    • Install dependencies: pip install numpy pandas torch scikit-learn matplotlib scipy.
    • Open in Jupyter Notebook or Google Colab.
    • Run all cells to reproduce analysis and visualizations.
    • Alternatively, view statically via nbviewer.
  2. Download Bias_Audit_Report_SA.pdf for a static version.
  3. Review Presentation_Slides.pdf and Ethics_Statement.pdf.

License

MIT License.

References