The murder of George Floyd brought to the world stage the long-held biases against and unethical treatment of Black people and minority groups in America and around the world. Governments, individuals and organizations collectively began to stop and consider how they can repair their relationships with the people they have oppressed. And while there is a long, LONG way to go, I am so proud that society has finally begun this work in earnest. Not just giving lip service and bandwagoning the latest social trend, but truly considering how real change can be made.
In the tech sector this kind of self-reflection is just as important. Are we building biased, racist machines? The short answer, unfortunately, is yes. As technologists we create algorithms that determine everything from the minute aspects of people's lives to the life-altering. And many organizations do not have processes in place to ensure that they have taken this kind of algorithmic bias into account. However, taking steps to prevent bias in current and future AI systems is just one way we can ensure equity for all, now and into the future.
Here are just a few ways your organization can begin the process of ensuring best practices in preventing algorithmic bias:
Understand the problem. No, REALLY understand the problem
Step back and look at the question being framed and ask whether it truly needs an automated solution. How has this question been addressed in the past, and what issues arose then? You likely won't fix every issue simply through automation, particularly if you are using training data based on the problematic decisions of the past. When researchers created an algorithm to predict recidivism in prisons, the automated system both reproduced and exacerbated racial and socioeconomic bias. The system was more likely to assign higher recidivism scores to Black and Hispanic prisoners, despite research showing that race has no bearing on recidivism. Additionally, the algorithm was less likely to recommend women for good behavior and parole. The system that was supposed to produce fair and equitable judgments was created using the unfair and biased human decisions of the past. To create a fair and balanced system, technologists need to weigh multiple political, historical and economic factors. This should not be the job of the technologists alone; business and leadership roles need to play a part in the process.
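One concrete way to start is to audit the historical labels before training on them. Here is a minimal sketch (all group names and numbers are hypothetical) that compares how often each protected group received the positive historical label; a large gap between groups is a warning sign that the labels encode past biased decisions rather than ground truth:

```python
def label_rate_by_group(records):
    """For each protected-group value, compute the rate of the positive
    historical label (e.g. "flagged high risk" in past decisions).
    records: iterable of (group, label) pairs, label in {0, 1}."""
    totals, positives = {}, {}
    for group, label in records:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + label
    return {g: positives[g] / totals[g] for g in totals}

# Hypothetical historical decisions: (group, was_flagged_high_risk)
history = [("X", 1), ("X", 1), ("X", 0), ("Y", 0), ("Y", 1), ("Y", 0)]
print(label_rate_by_group(history))
```

If group X's historical flag rate is double group Y's, any model trained to reproduce those labels will inherit that disparity, which is exactly the conversation technologists and leadership should be having before building.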
Make sure your data set is representative
Although this appears easy, representation needs to be thoughtfully considered. Is the audience (both the intended and unintended audiences) represented in your training data set at the correct proportions? Even with a representative dataset, models can have a greater error rate for some groups. A large technology company's facial recognition system was trained on a nearly 50-50 split of male and female participants, using faces and skin tones that covered the entire Fitzpatrick scale in reasonable proportion to the population. However, after testing, the company saw that even with a representative dataset the algorithm was much more likely to misclassify women with darker skin tones. Really consider whether your dataset is representative, and remember that testing for these unforeseen biases is key.
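Testing for this kind of unforeseen bias can start with something as simple as breaking the error rate out by subgroup instead of reporting a single overall accuracy. A minimal sketch, using hypothetical labels, predictions and group identifiers:

```python
from collections import defaultdict

def error_rates_by_group(y_true, y_pred, groups):
    """Compute the misclassification rate for each subgroup.

    y_true, y_pred: parallel lists of labels; groups: parallel list of
    group identifiers (e.g. intersections like "darker-skin, female")."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical evaluation: overall accuracy looks fine,
# but one subgroup carries almost all of the errors.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 0, 0, 1]
groups = ["A", "A", "A", "B", "B", "B", "A", "A"]

print(error_rates_by_group(y_true, y_pred, groups))
```

An aggregate metric over this data would look acceptable; the disaggregated view shows group B failing at a far higher rate, which is the pattern the facial recognition audit above uncovered.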
Have the hard discussions
Talk about the historical inequities that your industry has seen in years past. Consider testing for bias to see whether the training set is misrepresentative or the algorithm is inadvertently recreating the poor decisions of the past. Talk with your data team about how bias can be mitigated, create processes to help inform and regulate data use within your company, and have the hard discussions about where your blind spots are and which groups they affect. These conversations should not be left to the technology teams alone; they should be a part of the company culture at all levels.
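As part of those processes, a data team can routinely compare each group's share of the training set against its share of the relevant population. A minimal sketch, with hypothetical group names, made-up counts, and an arbitrary flagging threshold:

```python
def representation_gap(sample_counts, population_shares):
    """Compare each group's share of the training data with its share of
    the reference population. Negative gap = under-represented."""
    total = sum(sample_counts.values())
    return {
        group: sample_counts.get(group, 0) / total - pop_share
        for group, pop_share in population_shares.items()
    }

# Hypothetical audit: reference population shares vs. training-set counts.
population = {"group_a": 0.60, "group_b": 0.30, "group_c": 0.10}
training = {"group_a": 900, "group_b": 80, "group_c": 20}

for group, gap in representation_gap(training, population).items():
    # Flag anything more than 5 points under-represented (arbitrary cutoff).
    status = "UNDER-represented" if gap < -0.05 else "ok"
    print(f"{group}: gap {gap:+.2f} ({status})")
```

Running a check like this on every dataset refresh turns "have the hard discussion" into a standing agenda item rather than a one-off conversation.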
Bias is everywhere, and no one is immune to it. The systems that drive our businesses, governments and organizations are just as fallible as the people who build them. However, all organizations should strive to take a broad view of our world into consideration. Doing so will not only promote practices that are equitable and beneficial to society but will also be good for business!