Yeah, those who know me know that I'm not the biggest fan of Skynet. There are definitely some things in the AI world that are making people defer to it rather than to traditional methods of getting help. To me, though, this seems much like blaming the screwdriver manufacturer because someone stuck the tool in his eye, or blaming the fork for obesity, and so forth.
There is a societal epidemic with regard to mental health; that much is true these days. And while I don't want to get into a long dialogue about mental health, I have to wonder: where was the person's family? Now they are seeking damages in what they are framing as a wrongful death suit. How concerned were they about their own family member, who may have been exhibiting signs of paranoid delusion?
One might argue the merits of explainable vs. non-explainable models in AI, but this article seemed to omit any details about Soelberg's (and his family's) responsibility in all this. Is there personal responsibility to be had in human interaction with AI, or is this another example of the woman blaming McDonald's for burns to her lap from spilled hot coffee, just because the cup didn't say 'hot'?