HUMAN 1
I am assuming that the computer program did not factor in race at all and instead focused on actual circumstances, which black people simply tend to be worse off in. Then ProPublica got outraged after doing some simple data mining, because actually working to understand and resolve issues has been well outside of the American left’s wheelhouse for decades.
HUMAN 2
The algorithm is literally biased
Yeah, you can say you have a completely race-blind algorithm, but if it’s blind to the racism embedded in its input data, its results will suffer from that racism as well.
For example, asking, ‘Was one of your parents ever sent to jail or prison?’ is closely akin to asking whether someone’s grandparent was a slave before forcing them to pay a poll tax or take a literacy test. The question may not be inherently racist, but it addresses a reality that was racist and affected people disproportionately.
If you’re white, your parents are less likely to have been arrested for smoking weed in the 1970s. If you’re white, you’re more likely to have gotten off with a warning when you got in a fight in high school than to have been prosecuted for a felony. If you’re white, you probably have a social network that can provide you with a job more easily, because your family, friends, neighborhood, and classmates were allowed to inherit and increase their wealth.
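The mechanism described above can be sketched in a few lines of Python. This is a toy simulation with made-up numbers, not COMPAS itself: both groups are given an identical true reoffense rate, but one group's parents were arrested far more often for the same conduct (the hypothetical enforcement gap is an assumption for illustration). A score that never looks at race, only at the parent-incarceration question, still lands much harder on one group.

```python
import random

random.seed(0)

def make_person(group):
    # Assumption: identical true reoffense probability in both groups.
    reoffends = random.random() < 0.3
    # Hypothetical historical enforcement bias: group "B" parents were
    # arrested three times as often for the same underlying conduct.
    p_parent_arrested = 0.6 if group == "B" else 0.2
    parent_arrested = random.random() < p_parent_arrested
    return {"group": group, "reoffends": reoffends,
            "parent_arrested": parent_arrested}

people = [make_person(g) for g in ("A", "B") for _ in range(10_000)]

def risk_score(person):
    # A "race-blind" score: it never reads person["group"], only the proxy
    # question 'Was one of your parents ever sent to jail or prison?'
    return 1 if person["parent_arrested"] else 0

def mean_score(group):
    scores = [risk_score(p) for p in people if p["group"] == group]
    return sum(scores) / len(scores)

print(mean_score("A"))  # ~0.2
print(mean_score("B"))  # ~0.6, despite equal true reoffense rates
```

The disparity comes entirely from the proxy: because the feature itself was produced by biased enforcement, excluding race from the inputs does nothing to exclude racism from the output.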
An algorithm that perpetuates systemic biases probably is not a well-designed one.