A new report accuses Talkspace, the mobile therapy startup, of harvesting data from clients' private therapy chats. If true, the accusation raises serious ethical questions about the technology company's respect for patient rights and its understanding of the strict ethical rules governing patient-client confidentiality.
Former employees and therapists at Talkspace told The New York Times that anonymized chats between health care professionals and their clients are regularly reviewed and mined by the company. Because the text conversations are considered medical records, users cannot delete their transcripts. One therapist said that after she referred clients to resources outside the Talkspace app, a company representative told her she should advise clients to continue using Talkspace, even though she said she had never disclosed that conversation to anyone at the company. The company contends the conversation may have been flagged by algorithmic review, according to The Times.
A pair of former employees said Talkspace data scientists combed through customers' transcripts for common phrases that could be used to better target potential customers. Talkspace denies mining data for marketing purposes. Similarly, some therapists told The Times that Talkspace seems to know when clients work for "enterprise partners" like JetBlue, Google and Kroger, and pays those clients special attention. One therapist said the company contacted her to express concern when it thought she was taking too long to respond to two clients from Google. The Times also reported that the company was developing a bot to catch signs that a client might be in trouble, ostensibly to help therapists catch red flags they might miss.
A Talkspace spokesperson denied the allegations. "We take the same special care of all our corporate partners and their employees as we do with each individual consumer," John Reilly, general counsel at Talkspace, told Salon via email. "The main difference is that onboarding a large corporate account is a bit more involved than matching a single person, so we have broad deployment protocols and deployment managers for each large business customer to ensure a smooth transition at the beginning of every relationship."
As for the bots, Reilly told Salon that "we provide our network of therapists with an array of analytical tools for their digital practice. One program will scan anonymized text to alert a therapist to language that might indicate a client has an emerging problem or an escalating issue."
The Times highlighted the story of a man named Ricardo Lori, who was recruited into the company's customer service department after being an avid user for years. When an executive asked him to read an excerpt from a therapy chat log in front of employees, to give them a better sense of the user experience, and assured him that he would remain anonymous, he agreed. After the presentation, however, the Times reported that Lori's confidence was betrayed:
As Mr. Lori drank a tall glass of red wine and watched, he noticed a few employees repeatedly glancing at him. Then a member of the marketing department approached and asked if he was okay. Later, Oren Frank, the company's co-founder and chief executive, thanked him in the elevator. Somehow, word had gotten around that Mr. Lori was the client in the re-enactment.
In response to the allegations, Talkspace wrote in a Medium post that many of its answers to the Times' interview questions were not included in the final story, and that the story also omitted the response of a prominent clinician supportive of Talkspace.
Despite the accounts from former employees and company sources, Reilly denies that Talkspace mined transcript data for marketing, claiming that only data stripped of user-identifiable information is reviewed, and only for quality control.
The Health Insurance Portability and Accountability Act (HIPAA) privacy rule sets strict limits on how health care providers may share patient information. It specifies that practitioners may share medical information with one another only for the purpose of providing adequate care to their patients; that they are not authorized to disclose personal medical information to the public; that patients have the right to see and, if necessary, correct their records; and that personal medical information may not be disclosed so that providers can improve their marketing.
Hayley Tsukayama, a legislative activist at the Electronic Frontier Foundation, told Salon via email: "If it is true that Talkspace used information from private therapy sessions for marketing purposes, then it is a clear violation of its customers' trust. All companies must be very clear with their customers about how they use their personal information, ensure that they are not using information in ways consumers did not expect, and give them an ongoing chance to withdraw consent for those purposes. Talkspace does business based on trust and regularly mentions privacy in its advertising campaigns. It must live up to that promise."
Talkspace has faced scrutiny in the past over its ethics, labor practices and efficacy. In 2016, the company was accused of giving therapists scripts advertising Talkspace services, lacking adequate plans for patients in danger, and monitoring conversations between therapists and patients. (That story was originally reported by The Verge.) The company was also accused of threatening legal action against therapists who tried to establish relationships with clients off-platform, something chief executive Oren Frank said happened only "in some extremely unusual cases" in which therapists allegedly engaged in "serious ethical violations or potentially dangerous communication."
Like other online therapy apps, Talkspace is not covered by most health plans, and a regular subscription typically costs hundreds of dollars out of pocket. Therapists who spoke to Salon about online therapy apps like Talkspace last year said the platforms pay therapists poorly and obscure actual wages. That mirrors what many contract workers have experienced at gig-labor companies like Uber, DoorDash and Lyft: employers obscure actual pay to make the work seem more attractive than it is.