
How Can Bias Be Eliminated from AI Systems?


Over the years, human biases have been laid bare, and most of them are well documented. From field experiments that demonstrate bias in action to implicit association tests that reveal prejudices we are not even aware of, it is well established that human judgment is far from impartial. As artificial intelligence develops at an unprecedented pace, it is becoming clear that these human biases have made their way into AI systems, leading to harmful results.

At a time when many organizations are looking to deploy AI systems across their enterprises, understanding the risks and finding ways to reduce them should be an urgent priority. There have been many instances of AI systems showing bias. One such case came to light in Broward County, Florida, where a risk-scoring system gave higher risk scores to African American defendants than to white defendants, even in cases where the white defendants already had criminal records.

The list of biases shown by AI systems is long. GPT, a natural language model widely known for its ability to write impressive essays, is among them: it has been found to produce racist and sexist completions when given prompts about minorities. Similarly, Amazon’s AI-based automated hiring system was found to be biased against female candidates. Other commercial software, including facial recognition systems, has also shown bias and elevated error rates for minorities, mainly people of African descent.

While bias in AI is a bad thing, there can be a silver lining. AI can help expose just how messy the underlying data sets are, and in the process it helps us better understand biases that have not already been isolated. With the right algorithms and the machine-learning aptitude to identify anomalies, human biases in data can be exposed, provided the outputs are weighed rationally. The ability to track down bias in data is increasing the pressure on designers to combat the societal prejudices that find their way into AI systems.
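To make this concrete, here is a minimal sketch of the kind of audit the paragraph above describes: comparing a system's favorable-outcome rate across demographic groups. The data, column names, and tolerance threshold are illustrative assumptions, not taken from any specific system.

```python
# A toy demographic-parity check: compare a model's positive-outcome
# rate across groups. Data and threshold are hypothetical.
import pandas as pd

# Hypothetical audit log: one row per decision the system made.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B", "A"],
    "approved": [1,   1,   0,   0,   0,   1,   0,   1],
})

# Rate of favorable outcomes per group.
rates = decisions.groupby("group")["approved"].mean()
print(rates)

# Flag a disparity if the gap between groups exceeds a chosen tolerance.
gap = rates.max() - rates.min()
if gap > 0.2:  # the tolerance is an arbitrary assumption for this sketch
    print(f"Potential bias: outcome rates differ by {gap:.0%} across groups")
```

A real audit would use far larger samples and established fairness metrics, but even this simple disparity check illustrates how examining outputs rationally can surface bias hiding in the data.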

Curating data is the leading area in which designers and data scientists are now investing their efforts to eliminate bias from AI systems. In Western nations with massive digital footprints, the available data is still asymmetric and lacks diversity, and these inequalities lead to unfair outcomes. A curator's main aim is to track down such issues and eliminate prejudices associated with attributes such as race and gender.
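One simple curation step implied above is measuring how groups are represented in a training set and rebalancing it. The sketch below assumes a hypothetical data set with a "gender" column; the upsampling remedy shown is just one of several options a curator might choose.

```python
# Measure group representation in a training set, then rebalance by
# upsampling under-represented groups. Data and columns are hypothetical.
import pandas as pd

training_set = pd.DataFrame({
    "gender":  ["F", "M", "M", "M", "M", "F", "M", "M"],
    "feature": [0.2, 0.5, 0.1, 0.9, 0.4, 0.7, 0.3, 0.6],
})

counts = training_set["gender"].value_counts()
print(counts)  # reveals the asymmetry: 6 M rows vs. 2 F rows

# Upsample smaller groups (with replacement) to match the largest group.
target = counts.max()
balanced = (
    training_set.groupby("gender", group_keys=False)
    .apply(lambda g: g.sample(target, replace=True, random_state=0))
)
print(balanced["gender"].value_counts())  # now equal representation
```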

Another problem with using readily available data to train AI systems is that it often already contains biases. For example, anonymous, user-generated text posted on forums frequently carries prejudices that make their way into language-generation systems, which helps explain the biased completions seen in some state-of-the-art models. Training data must therefore be vetted carefully to reduce bias and improve quality. Uncurated data also comes with costs, and anyone who decides to use it in an AI system must be prepared to bear them.
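The vetting step might look something like the sketch below: screening scraped text before it enters a training corpus. The keyword filter here is a deliberately crude stand-in for a real toxicity classifier, and the terms and documents are placeholders.

```python
# Screen scraped forum text before adding it to a training corpus.
# A keyword blocklist stands in for a real toxicity classifier here.
BLOCKED_TERMS = {"slur1", "slur2"}  # placeholder terms, not a real list

def is_acceptable(document: str) -> bool:
    """Reject any document containing a blocked term."""
    words = set(document.lower().split())
    return words.isdisjoint(BLOCKED_TERMS)

raw_corpus = [
    "a perfectly ordinary forum post",
    "a post containing slur1 that should be filtered out",
]

vetted_corpus = [doc for doc in raw_corpus if is_acceptable(doc)]
print(f"kept {len(vetted_corpus)} of {len(raw_corpus)} documents")
```

Production pipelines replace the blocklist with trained classifiers and human review, which is part of the cost of curation the paragraph above mentions.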

As artificial intelligence continues to gain traction across industries and its deployment rises, individuals, companies, society, and governments need greater awareness of the technology's impacts. Although no single method can remove bias from AI systems, the only reliable approach is to ensure that learning data is as free from bias as possible, despite the challenges involved. Done correctly, such fairness in data can change our society for the better.

Scott Koegler

Scott Koegler is Executive Editor for PMG360. He is a technology writer and editor with 20+ years of experience delivering high-value content to readers and publishers.


scottkoegler.me/

