Crisis Text Line stops sharing conversation data with AI company

A privacy group said its statement calling CTL a ‘model steward of personal data’ was taken out of context

Crisis Text Line has decided to stop sharing conversation data with spun-off AI company Loris.ai after facing scrutiny from data privacy experts. “During these past days, we have listened closely to our community’s concerns,” the hotline service writes in a statement on its website. “We hear you. Crisis Text Line has had an open and public relationship with Loris AI. We understand that you don’t want Crisis Text Line to share any data with Loris, even though the data is handled securely, anonymized and scrubbed of personally identifiable information.” Loris.ai will delete any data it has received from Crisis Text Line.

Politico recently reported that Crisis Text Line (which is not affiliated with the National Suicide Prevention Lifeline) had been sharing conversation data with Loris.ai, which builds AI systems designed to help customer service reps hold more empathetic conversations. Crisis Text Line is a not-for-profit service that runs a text line for mental health discussions, but it is also a shareholder in Loris.ai and, according to Politico, at one point shared a CEO with the company.

“No data scrubbing technique or statement in a terms of service can resolve that ethical violation”

Before hotline users speak with volunteer counselors, they consent to data collection and can read the organization’s data-sharing practices. Politico quoted one volunteer who said that people who contact the line “have an expectation that the conversation is between just the two people that are talking” and who said he was fired in August after raising concerns about CTL’s handling of data. That volunteer has since started a petition pushing CTL “to reform its data ethics.”

Politico also noted that Crisis Text Line itself says data use and AI play a role in how it operates.

Following the report, Crisis Text Line released a statement on its website and in a Twitter thread, saying it does not “sell or share personally identifiable data with any organization or company.” It went on to say that “[t]he only for-profit partner that we have shared fully scrubbed and anonymized data with is Loris.ai. We founded Loris.ai to leverage the lessons learned from operating our service to make customer support more human and empathetic. Loris.ai is a for-profit company that helps other for-profit companies employ de-escalation techniques in some of their most notoriously stressful and painful moments between customer service representatives and customers.”

In its defense, Crisis Text Line said, “Our data scrubbing process has been substantiated by independent privacy watchdogs such as the Electronic Privacy Information Center, which called Crisis Text Line ‘a model steward of personal data.’” It was citing a 2018 letter to the FCC. However, that defense is shakier now that the Electronic Privacy Information Center (EPIC) has responded with its own statement saying the quote was used outside of its original context.

The Loris.ai website claims that “safeguarding personal data is at the heart of everything we do” and that “we draw our insights from anonymized, aggregated data that have been scrubbed of Personally Identifiable Information (PII).” That’s not enough for EPIC, which argues that Loris and CTL are seeking to “extract commercial value out of the most sensitive, intimate, and vulnerable moments in the lives (of) those individuals seeking mental health assistance and of the hard-working volunteer responders… No data scrubbing technique or statement in a terms of service can resolve that ethical violation.”

Update, 10:15PM ET: This story has been updated to reflect Crisis Text Line’s decision to stop sharing data with Loris.ai.
