We are increasingly turning to chatbots on smart speakers or websites and apps to answer questions. And as these systems, powered by artificial intelligence (AI) software, become ever more sophisticated, they are starting to provide pretty decent, detailed answers.
But will such chatbots ever be human-like enough to become effective therapists?
Computer programmer Eugenia Kuyda is the founder of Replika, a US chatbot app that says it offers users an "AI companion who cares, always here to listen and talk, always on your side". Launched in 2017, it now has more than two million active users. Each has a chatbot or "replika" unique to them, as the AI learns from their conversations. Users can also design their own cartoon avatar for their chatbot.
Ms Kuyda says that people using the app range from autistic children who turn to it as a way to "warm up before human interactions", to adults who are simply lonely and need a friend. Others are said to use Replika to practise for job interviews, to talk about politics, or even as a marriage counsellor.
And while the app is designed primarily to be a friend or companion, it also claims it can benefit your mental health, such as by enabling users to "build better habits and reduce anxiety".
Around the world there are almost one billion people with a mental disorder, according to the World Health Organization (WHO). That is more than one person out of every 10. The WHO adds that "just a small fraction of people in need have access to effective, affordable and quality mental health care".
Dr Paul Marsden, a member of the British Psychological Society, says apps that aim to improve your mental wellbeing can help, but only if you find the right one, and then only in a limited way. "When I looked, there were 300 apps just for anxiety... so how are you supposed to know which one to use?

"They should only be seen as a supplement to in-person therapy. The consensus is that apps don't replace human therapy."
Yet at the same time, Dr Marsden says he is excited about the power of AI to make therapeutic chatbots more effective. "Mental health support is based on talking therapy, and talking is what chatbots do," he says.
But what if a person's relationship with their chatbot therapist becomes unhealthy? Replika made headlines in February when it was revealed that some users had been having explicit conversations with their chatbot. The news stories appeared after Luka, the firm behind Replika, updated its AI system to prevent such sexual exchanges.
UK online privacy campaigner Jen Persson says there needs to be more global regulation of chatbot therapists. "AI companies that make product claims about identifying or supporting mental health, or that are designed to influence your emotional state, or mental well-being, should be classified as health products, and subject to quality and safety standards accordingly," she says.
Vocabulary:
chatbot: a computer program that simulates human conversation
sophisticated: complex and refined; advanced
therapist: a treatment specialist; a psychotherapist
programmer: a person who writes computer programs
companion: a person or thing one spends time with; a partner
avatar: an image or icon representing a person online
autistic: affected by autism
explicit: clear and detailed; (of material) sexually graphic
Editor: ETTBL
Translator: Gleen
Source: BBC