Does AI have a ‘white guy’ bias?

Feature articles
By Peter Clarke

I just read a well-written and interesting New York Times commentary entitled "Artificial Intelligence's White Guy Problem" by Kate Crawford. Crawford is a principal researcher at Microsoft and co-chairwoman of a White House symposium on society and artificial intelligence.

The "I" in artificial intelligence (AI) relies on inputs from the human beings who create and teach it. Crawford argues that sexism, racism and other forms of discrimination are being built into the machine-learning algorithms that underlie many "intelligent" systems, systems that shape how we are categorized and advertised to.

As designers, we all put a bit of ourselves into our designs, whether they are analog, power or software-related such as AI, even if only subconsciously. But with software, and with the learning process for AI in particular, the data fed into the system can be prejudiced, even if not intentionally so.

In many machine-learning systems, an AI learns much as a baby does: by observing and imitating a chosen type of system behavior. If human-modulated behavior is part of that system, it can introduce bias, depending on the people being watched and the prejudices reflected in the way they perform tasks. Machine-vision systems may use neural-network algorithms that learn by seeing a multitude of images; the humans who select those images can thereby introduce bias that ultimately prejudices the AI's decisions.
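To make the point concrete, here is a minimal sketch (with entirely hypothetical data and labels, not taken from any real system) of how an under-represented group in a training set can skew even a very simple learner. A 3-nearest-neighbour classifier trained on many examples of group "A" and only one of group "B" will mislabel a clear "B" example, because the majority group dominates the vote:

```python
# Hypothetical illustration: bias from an imbalanced training set.
# Each training point is (feature_value, label).

def knn_label(x, training, k=3):
    """Classify x by majority vote among the k nearest training points."""
    nearest = sorted(training, key=lambda p: abs(p[0] - x))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Skewed training data: five examples of group A, one of group B.
skewed = [
    (1.0, "A"), (1.2, "A"), (0.9, "A"), (1.1, "A"), (1.3, "A"),
    (5.0, "B"),   # group B is under-represented
]

# A balanced set adds more examples of group B.
balanced = skewed + [(4.8, "B"), (4.9, "B"), (5.1, "B"), (5.2, "B")]

# A test point that clearly belongs to group B (feature near 5.0):
print(knn_label(4.8, skewed))    # the majority group A outvotes B
print(knn_label(4.8, balanced))  # with balanced data, B is recovered
```

The classifier itself is not "prejudiced"; the skew in what it was shown does the damage, which is exactly the concern Crawford raises about image sets curated by humans.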

This basic problem of prejudice is not new; what is new is the advanced technology of AI, which magnifies it. Designers and programmers need to continually refine their software algorithms so that the service performed is carried out in an unbiased manner. A Google autonomous vehicle that hits a bus needs to have its software algorithms modified, and this will be an iterative process. We have entered a new realm of engineering with AI, and new measures and rules will need to be established so that these so-called prejudices can be avoided. Even HAL (the Heuristically programmed ALgorithmic computer) was prejudiced against anyone trying to terminate it or the mission: "I'm sorry, Dave. I'm afraid I can't do that."

As humans, we all have certain prejudices, such as avoiding a chatty, loud person or not going to restaurants that serve food we dislike. There are also more serious prejudices, against race, creed and color, to name a few.

MIT Press has an interesting book on this topic entitled The Machine Question by David J. Gunkel.

What are your thoughts on this subject?

Steve Taranovich is the editor-in-chief of EE Times’ Planet Analog website where this article first appeared.

Related links and articles:

Four elements added to the periodic table

Moore’s Law: a mixed-signal perspective

Slideshow: Is anybody out there?
