Comments

  • Artificial Emotion: The ethics of AI therapy chatbots expressing sympathy & empathy.
    I completely agree.

    Do you think there's a difference in that humans are at least capable of feeling emotion, whereas the technology has been programmed to 'fake' emotion every single time it expresses it?
  • Artificial Emotion: The ethics of AI therapy chatbots expressing sympathy & empathy.
    It really depends on the form of emotional support the person needs. Although Natural Language Processing has improved, chatbots are not capable of having anywhere near as deep a conversation as humans are (& likely never will be able to).

    Empirical research on the efficacy of mental health chatbots is still very thin. A few studies show they can be an effective means of support, but only when compared to no support or to other forms of computerised support.

    Having said that, some people feel more comfortable talking to technology, perceiving a lack of judgement. This could apply to groups who face increased stigma & find it hard to confide in others about their emotional state. Woebot claim that users report "talking to Woebot makes it easier to talk to their partner or a therapist".

    If they're just providing a safe space to talk & reflect on your feelings, then I think they can be effective to some degree.