looks at non-podcasting couple “so which one of you does the painstaking research into a widely misunderstood historical event, and which one of you reacts to hearing about it in real time?”
This is a great example of how AI automates bias. It’s already well known that doctors tend to downplay women’s medical concerns as having an emotional basis.
AI is only as good as the data it was trained on, so the chatbot simply reflects the gender bias of the typical doctor. [twitter.com]
High on the list of weird Apple keynote brags is changing the font in the camera app. It even made the marketing website!