
Google’s automatic image recognition software has apparently picked up photographs of two black friends and marked them as subhuman.
New Yorker Jacky Alcine was going through his pictures when he noticed an album titled ‘Gorillas’.
But when he opened it up all he found were images of himself and a friend.
Google official Yonatan Zunger quickly apologised on Twitter, saying: ‘Sheesh. High on my list of bugs you *never* want to see happen. ::shudder::’
He added that the team was trying to fix the bug.
Programmers are working on longer-term fixes, he said, both around which words should be used carefully in photos of people and around improving recognition of ‘dark-skinned faces’.
The point of the software is to save you the trouble of organising similar photos together, so that, for example, you could easily find all your birthday pictures in one place.
But someone definitely needs to sort this out.
Google said in a statement:
We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.
This is horrible and they should have tested this software for any *bugs* before releasing it.
Your thoughts?