115 - Machine Bias
by You Are Not So Smart, published on 2017-11-20T06:15:20Z

We've transferred our biases to artificial intelligence, and now those machine minds are creating the futures they predict. But there's a way to stop it. In this episode we explore how machine learning is biased, sexist, racist, and prejudiced all around, and we meet the people who can explain why, and who are going to try to fix it.

Genre: Science

Comment by Kyle Witherrite (2017-12-26T14:34:41Z)
@michaelmacphail: He needs to check his phone's privilege.

Comment by Michael MacPhail (2017-11-28T01:37:45Z)
Mine actually suggested "he" after "the nurse said".

Comment by Allan McPherson (2017-11-27T23:17:03Z)
@youarenotsosmart: USER#555 makes a good argument (by accident or otherwise) for non-gendered pronouns.

Comment by You Are Not So Smart (2017-11-21T19:43:04Z)
@youarenotsosmart: Right, but for the rest of the episode we explore why this is a problem when we allow these machines to guide our assumptions about the future.

Comment by USER#555 (2017-11-21T16:23:14Z)
9.6% of nurses in the US are male, 32.4% of doctors in the US are female, and 6.61% of pilots are female. Your machine isn't biased; it's just erring on the side of reality.