User interaction patterns can reveal conscious and unconscious biases, e.g., a job recruiter viewing only male candidate profiles on LinkedIn. The group developed models to detect and measure bias from many such interaction patterns. Now, as designers, how can we take those bias metrics and make them usable in a visual analytics system? Specifically:
How do we design an interface that optimally mitigates the captured bias?
We derived a design space of eight dimensions that can be manipulated to influence a user's cognitive and analytic processes.
To illustrate potential bias mitigation strategies, I designed an example system called fetch.data: a hypothetical dashboard in which job recruiters screen candidate resumes for hiring.
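For concreteness, here is a minimal sketch of the kind of interaction-based bias metric such a dashboard could consume. It assumes the interaction log simply records which candidates the recruiter has viewed, and it scores bias for a single categorical attribute as the divergence between the attribute distribution of interacted candidates and the baseline distribution of the full pool. The names (Candidate, proportions, attributeDistributionBias) are illustrative, not the group's actual model:

```typescript
// Illustrative interaction-based bias metric for one categorical
// attribute (gender). All names and choices here are assumptions.

interface Candidate {
  id: string;
  gender: string; // the categorical attribute under scrutiny
}

// Share of each attribute value within a set of candidates.
function proportions(candidates: Candidate[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const c of candidates) {
    counts.set(c.gender, (counts.get(c.gender) ?? 0) + 1);
  }
  const props = new Map<string, number>();
  for (const [value, count] of counts) {
    props.set(value, count / candidates.length);
  }
  return props;
}

// Bias score in [0, 1]: total variation distance between the
// distribution of interacted candidates and that of the full pool.
// 0 = interactions mirror the data; values near 1 = interactions
// concentrate on a narrow slice of attribute values.
function attributeDistributionBias(
  all: Candidate[],
  interacted: Candidate[],
): number {
  const baseline = proportions(all);
  const observed = proportions(interacted);
  let distance = 0;
  for (const value of new Set([...baseline.keys(), ...observed.keys()])) {
    distance += Math.abs((baseline.get(value) ?? 0) - (observed.get(value) ?? 0));
  }
  return distance / 2;
}
```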
Intervention strategies for bias mitigation (strategies 2 through 4 are sketched in code after the list):
1. Where and when do we present the bias metrics? [B.2: on-demand view of bias metrics]
2. How do we promote user awareness so that users reflect on their own interactions? [A.1: increase the size of the data points that have not been interacted with]
3. How can we redirect a user's interactions once bias is detected? [B.1: if the recruiter interacts with only one gender, disable the gender filter]
4. What degree of guidance is appropriate for mitigating bias? [D: recommend candidates the recruiter has excluded]
5. How do we collect user feedback to inform the model? [E: a pop-up that lets the recruiter provide feedback while dismissing a notification]
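As a sketch of how strategies 2 (A.1) and 3 (B.1) could hook into the metric above, reusing the Candidate type and attributeDistributionBias from the earlier snippet; the scaling factor and threshold are illustrative assumptions, not values from the fetch.data design:

```typescript
const BIAS_THRESHOLD = 0.4; // illustrative cutoff, not a tuned value

interface PointView {
  candidate: Candidate;
  radius: number;      // rendered size of the data point
  interacted: boolean;
}

// A.1: enlarge data points the recruiter has not interacted with,
// nudging attention toward the unexplored part of the candidate pool.
function emphasizeUnseen(points: PointView[], scale = 1.5): void {
  for (const p of points) {
    if (!p.interacted) p.radius *= scale;
  }
}

// B.1: if interactions skew heavily toward one gender, disable the
// gender filter so the recruiter cannot keep narrowing along the
// biased axis; re-enable it once the bias score recovers.
function maybeDisableGenderFilter(
  all: Candidate[],
  interacted: Candidate[],
  setFilterEnabled: (enabled: boolean) => void,
): void {
  const bias = attributeDistributionBias(all, interacted);
  setFilterEnabled(bias < BIAS_THRESHOLD);
}
```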
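Strategy 4 (D) can build on the same signal by surfacing candidates the recruiter has effectively excluded. How "excluded" is operationalized here, never interacted with and belonging to an under-attended attribute value, is an assumption for illustration:

```typescript
// D: recommend a few candidates from the part of the pool the
// recruiter's interactions have neglected (reuses proportions above).
function recommendExcluded(
  all: Candidate[],
  interacted: Candidate[],
  limit = 5,
): Candidate[] {
  const seen = new Set(interacted.map((c) => c.id));
  const observed = proportions(interacted);
  const baseline = proportions(all);

  // Attribute values the recruiter attends to less than the baseline.
  const underAttended = new Set(
    [...baseline.keys()].filter(
      (v) => (observed.get(v) ?? 0) < (baseline.get(v) ?? 0),
    ),
  );

  return all
    .filter((c) => !seen.has(c.id) && underAttended.has(c.gender))
    .slice(0, limit);
}
```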