
Reflection: Ethics In People Analytics

When assessing ethics, I would consider two factors: intent and impact. Predictive analytics models that are designed to be discriminatory per se, or whose results will be used in an intentionally malicious manner, can and should be labeled unethical. The example in Eric Siegel’s article about facial recognition software being used to identify Uighur Muslims in China is unethical in both design and application: it is based on racist beliefs and used to perpetuate systemic oppression of, and escalating human rights violations against, an entire ethnicity. Siegel’s article vastly understates the threat the Uighurs face; over one million are being held in camps, and Uighur women are being subjected to forced sterilization (BBC News, 2020).

Less obvious, but perhaps more dangerous and insidious, are systems built in good faith that nonetheless produce adverse impact. Because analytics models are only as good as their algorithms and input data, predictive analytics can perpetuate the biases of their designers or users. This can be mitigated through measures such as broad user testing, inclusive design processes, and a commitment to System 2 thinking. All of the above require transparency: analytics models and machine learning cannot self-correct, so there must be opportunities for humans to review the data with a critical eye rather than accept the output as a fait accompli.
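
To make that human review concrete, here is a minimal sketch in Python of one common audit, the four-fifths (80 percent) rule for adverse impact. The data and column names are invented for illustration; this is not drawn from Siegel’s article, just one example of the kind of critical check a model’s output should face before it is accepted.

```python
import pandas as pd

# Hypothetical model output: one row per applicant, with a group label and
# the model's hire recommendation (all names and values are made up).
results = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B", "B"],
    "selected": [1,   1,   0,   1,   0,   1,   0,   0,   1],
})

# Selection rate per group: the share of each group the model recommends.
rates = results.groupby("group")["selected"].mean()

# Four-fifths rule: flag the model if any group's selection rate falls
# below 80 percent of the most-selected group's rate.
impact_ratios = rates / rates.max()
flagged = impact_ratios[impact_ratios < 0.8]

if not flagged.empty:
    print(f"Potential adverse impact against group(s): {list(flagged.index)}")
```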


Where transparency is lacking, the opportunity to course-correct can be lost until it is too late. A recent example is the 2020 A-level results day in the United Kingdom. A-levels are subject-specific exams taken by millions of students annually and considered in university admissions. Due to the coronavirus pandemic, many students were unable to sit exams in 2020, and as much as 60 percent of grades were determined by statistical modeling (Katwala, 2020).

The black-box model used factors including the teacher’s predicted grade, scores from mock exams that were not administered under consistent conditions, and the school’s historical performance. Many of these data points are subject to bias or entirely outside the student’s control, and there was no transparency regarding how heavily each would be weighted; the sketch below shows why that matters. Although there was an appeal process by which students could rely on the teacher assessment alone or choose to sit the exam in the autumn, both options are fraught with procedural and unconscious-bias issues. With upwards of 5 million students relying on these scores to apply to top-level universities, the economic impact may be dire and long-reaching.
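
To see why opaque weighting matters, consider a deliberately simplified sketch. This is not the actual Ofqual algorithm, whose weighting was never disclosed; the factor names and weights below are invented purely to show how a hidden weight on school history can override everything a student actually did.

```python
# Hypothetical weighted-grade model. The weights are invented; the real
# model's weighting was never made transparent, which is the core problem.

def predicted_grade(teacher_score: float,
                    mock_score: float,
                    school_history: float,
                    weights: tuple = (0.2, 0.2, 0.6)) -> float:
    """Blend three inputs (each on a 0-100 scale) using hidden weights."""
    w_teacher, w_mock, w_school = weights
    return (w_teacher * teacher_score
            + w_mock * mock_score
            + w_school * school_history)

# A strong student (90s from the teacher and the mock exam) is pulled down
# by a factor entirely outside their control.
print(predicted_grade(90, 90, 55))  # 69.0 at a historically weak school
print(predicted_grade(90, 90, 90))  # 90.0 at a historically strong school
```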


To design a statistical model using flawed and inconsistent data, and then purposely obfuscate that model’s final design, is beyond unethical; it is immoral, approaching criminal.



References

BBC News. (2020, July 20). The Uighurs and the Chinese state: A long history of discord. https://www.bbc.com/news/world-asia-china-22278037

Katwala, A. (2020, August 17). Results day is a diversity disaster. Here’s all the proof you need. WIRED UK. https://www.wired.co.uk/article/results-day-exams-bias
