Monkey Holding Box? Google Confused Black Youngster With Monkey

Google has come under fire after a picture of a young black boy was mistakenly labeled ‘monkey holding a box’ in its image search tool. The incident sparked outrage and criticism across the Internet, with many pointing to it as an example of how Google’s algorithms can be biased. In this blog post, we will take a closer look at the issue and examine its implications for Google, the tech industry and society. We will also explore ways to prevent similar incidents in the future, and discuss what people can do on their end to fight bias and discrimination in technology.

Google’s new algorithm confuses black youngster with monkey

It’s no secret that Google’s algorithms are far from perfect. The company has come under fire in the past for a number of algorithm-related issues, including racist search results and offensive autocomplete suggestions.

Now, it seems that the company’s algorithms are once again causing problems, this time by confusing a black youngster with a monkey.

The implications of this error

The implications of this error are far-reaching. For one, it perpetuates a long and ugly history of racist comparisons of black people to monkeys. This is not only deeply offensive; it dehumanizes black people. It also reinforces the perception that Google is a biased and unreliable source of information: if Google cannot get a basic image label right, how can we trust it to provide accurate information on more complex topics? Finally, the error calls into question the algorithms Google uses to surface content. If those algorithms are biased or inaccurate, the consequences extend well beyond one search result and affect society as a whole.


In conclusion, an image of a young black boy was mistakenly labeled by Google as a ‘monkey holding a box’. This error highlights both the lack of diversity in the data behind image recognition technology and the need for more accurate algorithms. It is essential that companies invest in solutions that are designed not only to accurately identify different types of people, but also to take cultural and racial sensitivities into account when doing so.

