
#FF00AA


11 Sep 2019

@Carnage4Life

This is a great example of how AI automates bias. It’s already well known that doctors tend to downplay women’s medical concerns as having an emotional basis.

AI is only as good as the data that trained it. Thus the chatbot simply reflects the gender bias of the typical doctor. [twitter.com]

@DrMurphy11

The @babylonhealth Chatbot has descended to a whole new level of incompetence, with #DeathByChatbot #GenderBias .

Classic #HeartAttack symptoms in a FEMALE, results in a diagnosis of #PanicAttack or #Depression .

The Chatbot ONLY suggests the possibility of a #HeartAttack in MEN!

Want to know when I post new content to my blog? It's as simple as registering for free with an RSS aggregator (Feedly, NewsBlur, Inoreader, …) and adding www.ff00aa.com to your feeds (or www.garoo.net if you want to subscribe to all my topics). We don't need newsletters, and we don't need Twitter; RSS still exists.

Legal information: This blog is hosted by OVH, 2 rue Kellermann, 59100 Roubaix, France, www.ovhcloud.com.

Personal data about this blog's readers is neither used nor transmitted to third parties. Comment authors can request its deletion by e-mail.

All contents © the author or quoted under fair use.