Saturday, November 26, 2016

I’ve Been Replaced!


Even Doing a Most Human Task
For the last fifteen years I've been waiting for this to happen. I could see it gaining on me, but following Satchel Paige's advice I didn't want to turn around. I began to suspect this was coming when the fees that insurance companies paid us for psychotherapy began to drop. My colleagues thought it was due to the cheaper costs of people with Masters' degrees, but I knew better: chatbots and A.I.
In 1966, before there was a word for "chatbots," Joseph Weizenbaum at the MIT Artificial Intelligence Laboratory created Eliza, a computer program designed to respond to typed statements with simple replies, mostly repeating what the typer said, or saying "tell me more." You can talk to a slightly improved Eliza on this website: http://www.manifestation.com/neurotoys/eliza.php3
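Eliza's trick was simpler than it sounds. Here is a minimal sketch in Python of an Eliza-style responder (my own illustration, not Weizenbaum's actual program): it matches a few keyword patterns, flips "my" into "your," and otherwise falls back on "Tell me more."

```python
import re
import random

# Illustrative Eliza-style responder. The patterns and phrasings here are
# invented for this sketch; the original 1966 program was far richer.

# Words to swap so the reply points back at the speaker.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine",
}

# (pattern, possible responses) pairs, tried in order.
PATTERNS = [
    (r"i feel (.*)", ["Why do you feel {0}?", "Tell me more about feeling {0}."]),
    (r"i am (.*)",   ["How long have you been {0}?", "Why do you say you are {0}?"]),
    (r"my (.*)",     ["Tell me more about your {0}."]),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words, e.g. 'my mother' -> 'your mother'."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement: str) -> str:
    cleaned = statement.lower().strip().rstrip(".!?")
    for pattern, responses in PATTERNS:
        match = re.match(pattern, cleaned)
        if match:
            return random.choice(responses).format(reflect(match.group(1)))
    return "Tell me more."  # the famous fallback

if __name__ == "__main__":
    print(respond("I feel anxious about my future"))
    # e.g. "Why do you feel anxious about your future?"
    print(respond("The weather is nice today"))
    # "Tell me more."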
I’ve been a psychotherapist for over forty years. I can look at people and tell when the words they are saying are covering up what they are feeling. I’ve learned how to gently push people to look deeper into themselves and open up.
There were many times when I sat with couples who were trying to sort out the difficulties in their relationship. Often one of them had been caught acting like some of our past or future Presidents, which the other partner found upsetting. The guilty party would be very apologetic and swear on their children’s heads that it would never happen again.
After years of witnessing these acts of contrition I could accurately determine whom to believe and who was still attempting to deceive. I developed two methods that aided my intuition. The first was that I became a close observer of faces. There are very subtle differences in the coordination of the muscles between the eyes and the mouth. I'm not sure if I could tell from a picture. I think it was more about how they move, or don't move, together.
I also had the advantage of working in a small city in which everyone knew everyone else's business. I often had either the third member of the love triangle, or her roommate, or his sister, or someone else who was bothered by what was going on, also seeing me for psychotherapy. They had no idea that I was seeing this couple, but it bothered them enough to bring it up in their own sessions.
However, that unique talent, built from years of experience, has now been learned by a TV camera connected to a computer running a "deep learning" artificial intelligence program. I saw a demonstration of one of them two months ago at The Harvard Innovation Lab during Hub Week in Boston. The program, developed by a company called Affectiva, would focus on a person's face and determine their emotional state. According to their marketing materials it has developed "norms built on the world's largest emotional database of more than 25,000 media units and 4 million faces analyzed in 75 countries." They also state that "we are humanizing technology."
In my view, they are taking a big step toward dehumanizing psychotherapy.
There is a lot of research that shows that we humans are social creatures. We have evolved to care about each other. Those of us who have better social relationships, especially close ones with partners, family and friends, do better in life. Those people live longer, happier, healthier lives.
One of the major healing factors of psychotherapy has been that the patient feels a connection to an understanding, caring, non-judgmental person, especially one who has some knowledge and status, like someone you call "doctor." Would it be possible to "transfer" such feelings of trust and hope onto a machine?
Apparently, it is. In fact, studies have been done with war veterans who stated a preference for talking to a responsive A.I. program. When asked why, many of the veterans explained that when they recounted some really gruesome battlefield experience, such as having to shoot an eight-year-old girl in the head because she was carrying a bomb, they could feel the reaction from any human counselor, even another vet. But a machine could be empathic and still not be upset.
Also, during the years of Facebook, Twitter, Snapchat and even Medium, people have been conditioned to seek "likes." Those little electronic bits of encouragement evoke positive feelings of accomplishment, hope and happiness in response to any kind of feedback on social media. It keeps people coming back. It keeps people wanting more. Each little thumbs-up, heart or smile makes us feel a bit more connected, clever and important.
During the last ten years, more than a dozen companies have been using artificial intelligence and deep learning not only to monitor human emotion but to build machines that can respond in an artificially empathic manner, reflecting the emotion of the real human. In addition to responding to facial features, these programs can respond to the tone of voice and the content of the conversation. One program goes deeper than facial expressions and monitors the blood flow in the muscles of the face; its developers believe this can reveal when someone is trying to hide their emotions. There are also new ways of tracking other physiological components of emotion, including heart rate, neurological responses, stress and arousal levels, and other physical activity.

Could this be psychotherapeutic?

The technology is probably available today for you to sit in front of a screen and, by talking to a deep learning machine, diminish your emotional difficulties and improve your life. The machine could watch and listen to you, monitor the content of what you say and determine the emotions that you are feeling. It could probably tell if you are angry with your mother, have suspicions about your spouse, are exaggerating your accomplishments or are lying about your drug use. It could offer soothing, compassionate responses, as well as questions that would push you to explore the origins and complications of your difficulties. It could point out your errors in judgment without insulting you, and probably help you sort through ideas and find better solutions.
What is new about this is that the more these deep learning machines "practice," the better they get, just like a good clinician. They would learn more about your emotional patterns and your responses. You could report back and tell them what you felt was successful and helpful. In addition, they could also get information about your physical condition to see if your blood pressure was staying under control, or your cortisol levels were diminished. They could know if you sleep better after your sessions, and if you are generally happier and more productive. They wouldn't have to depend on your verbal reports during the first fifteen minutes of the session. If necessary, you could have a breathalyzer app added to the protocol.
Another big advantage is that your virtual, deep learning therapist would always be available, and the sessions could be as long or as short as you felt necessary. You wouldn't feel you were intruding on your therapist over the weekend. The therapist would never be tired, preoccupied by the troubles in his own life, or worrying about the fees. It would come with graphs and analytics that would help you monitor your own progress, and it could give you smiley faces and heart emojis whenever you did well.
Fantastic! 😀
Fantastic? 😟
Are there dangers with this? Sure are! Many.
So much depends upon the values and philosophies of the programmer. What gets reinforced? Does it just allow you to go down your own crazy path? Does it have a goal of making you a better cog in a conformist society? Those things are not yet clear.
The big question is: what about people? Will some people become much more comfortable with a compassionate machine and still find "live" people too anxiety-provoking, too demanding and too annoying?
Life can be tough. Each of us "live" people has our own agenda. We want something from you, just as you want something from us. That's what relationships are. That's why they work. That's why they don't work. If the machine annoys you, you can unplug it. That's not a good thing to do to a partner.
Remember, technology is a tool. Many new innovations are offered to us every day to help us live better, easier, happier, healthier, more efficient, creative, productive and satisfying lives.
But what all those words really mean is up to each of us, individually and as a society.
Choose wisely, my friends.

1 comment:

Forsythia said...

I hope there will always be room for both kinds of therapists. Suppose someone is thinking about jumping off a bridge. Can't you just see the cops approaching the distraught individual with a digital device programmed to talk him out of it?