Decades ago, one of the early attractions of simulated artificial intelligence was Eliza, a pretend therapist. Now Woebot comes along, and it's much the same concept: it tries to determine your issues and serve as therapy, drawing on far more responses than Eliza ever managed. I'm not thrilled with this. Even if its reasoning is sounder, its programming more solid, and its manners even better, it can only work with what we, as humans, provide it, and when we do that, we tend to try to break these things. Worse yet, we don't know who will be watching what we type in.
Woebot’s use of cognitive behavioral therapy has a philosophical and practical logic to it. Unlike forms of psychotherapy that probe the root causes of psychological problems, often going back to childhood, C.B.T. seeks to help people identify their distorted ways of thinking and understand how that affects their behavior in negative ways. By changing these self-defeating patterns, therapists hope to improve symptoms of depression and anxiety.