I have chosen to connect the topics of algorithmic bias, predictive policing, and diversity in games. At first these topics may not seem related, but all three reflect the same underlying problem: big data and predictive policing encode society's perceptions about certain races, and those same perceptions can be seen in the lack of diversity among video game characters.
Big data is used by all major corporations. It is a tool that gives them information about a market without actually surveying every single person who walks through the door. While this may seem harmless, it has damaging repercussions. When big data sets are collected from a small, unrepresentative slice of a community, the algorithms trained on them inherit that skew; this is algorithmic bias. Mathematical algorithms are also used to draw conclusions about people, but artificial intelligence trained on such data reproduces preconceived notions about certain groups. This kind of bias can be seen on major platforms like Flickr and Google. When processing pictures of Black men, automatic tagging would sometimes identify the men as “animals” or “apes” (Kirchner). Similarly, Google searches for Black-sounding names were “accompanied by ads about criminal activity” more often than searches for white-sounding names (Kirchner). This leads into predictive policing, where authorities run predictive algorithms over big data sets to try to stop problems before they even happen.
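To make that mechanism concrete, here is a minimal, hypothetical sketch of my own (not taken from Kirchner's reporting): a naive model that scores people by their group's historical “flag” rate will simply reproduce whatever skew existed in its training sample. The groups, numbers, and function names below are invented purely for illustration.

    # Hypothetical illustration of sampling bias, not any real policing system.
    from collections import defaultdict

    # Invented training records: (group, was_flagged). Group B appears
    # over-represented among flagged cases only because it was over-policed
    # in the sample, not because of any underlying difference.
    training_data = [
        ("A", False), ("A", False), ("A", False), ("A", True),
        ("B", True), ("B", True), ("B", True), ("B", False),
    ]

    def fit_group_rates(records):
        """Learn each group's historical flag rate from the sample."""
        totals, flags = defaultdict(int), defaultdict(int)
        for group, flagged in records:
            totals[group] += 1
            flags[group] += flagged
        return {g: flags[g] / totals[g] for g in totals}

    rates = fit_group_rates(training_data)

    # The "model" now assigns every new person their group's old rate,
    # turning the sampling skew into a prediction about individuals.
    print(rates)  # {'A': 0.25, 'B': 0.75}

Under these assumptions, anyone from group B starts with triple the risk score of anyone from group A, even though the difference came entirely from how the data was collected.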
Predictive policing is a very new, modern take on police duties. Through social media, police are able to use a person's own platform to judge whether that person is potentially threatening. At a glance, this seems like a great way to stop dangerous people from causing harm, especially since many extremists take to the internet to boast about their plans. However, the practice is biased against minority groups as a result of inherent racism. Police are asked to make assumptions about people based on whatever they find “problematic.” Sometimes a white man can explicitly talk about the harm he intends to cause with no police action, while a Middle Eastern man who merely talks about his religion is instantly flagged and watched by government authorities. I acknowledge that these are extreme examples, but my point still stands.
Big data, algorithmic bias, and predictive policing are all results of racism and ignorance about minority groups, and that same ignorance is very clearly reflected in the gaming industry. The lack of racial diversity suggests developers believe all video gamers are white. It is very rare to see a Black man who isn't made to look like a criminal, or an Asian man who isn't small and nerdy; most video game characters are either white men or white women. This is structural racism that tells whole groups of people that they do not matter, or that they are lesser than white men. The lack of diversity in games also plays into sexist tropes: the few female characters created are hypersexualized and cast as the weaker character. Again, this lack of diversity directly reflects society's perceptions about race and gender.
As a society we need to do better about the racist outcomes produced by big companies, and the corporations taking part in these practices need to be held accountable. There needs to be deeper investigation of the mathematical algorithms being used to sort people, police officials using social media to predict threats need to confront their own ignorance, and video games need to be inclusive of all people, not just white men. These are all different facets of life shaped by structural racism, and all of them can be combated with inclusion.
Works Cited
Kirchner, Lauren. “When Big Data Becomes Bad Data.” ProPublica, 9 Mar. 2019, http://www.propublica.org/article/when-big-data-becomes-bad-data.
