by veen
Big data claims to be neutral. It isn’t.
Approached without care, data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision-makers, or simply reflect the widespread biases that persist in society. It can even have the perverse result of exacerbating existing inequalities by suggesting that historically disadvantaged groups actually deserve less favorable treatment. Algorithmic decision procedures can exhibit these tendencies even when they have not been hand-coded to do so, whether by design or by accident. Scholars and policymakers have tended to worry that the inscrutability of algorithms will keep these intentions or mistakes hidden, but discrimination may also be an artifact of the data mining process itself, rather than the result of programmers assigning certain factors inappropriate weight.

Interesting (free) article on how big data and data mining can be inherently bad at judging people, particularly in law.