A chatbot turns hostile[1]. A test version of a Roomba vacuum[2] collects images of users in private situations. A Black woman is falsely identified as a suspect[3] on the basis of facial recognition software, which tends to be less accurate at identifying women and people of color[4].
These incidents are not just glitches, but examples of...
Read more: https://theconversation.com/are-tomorrows-engineers-ready-to-face-ais-ethical-challenges-213826