I've had my moments of doubt when dealing with AI. While it may seem that every platform we work with is using (and pushing) the use of it, we should be thinking about how we, as a field of educators and clinicians, can get out a unified message: we are not replaceable by AI.
Why? Let's look at some reasons.
First, and most obviously, AI relies on the Internet for its information. As we all know, much of the sex education information out there is erroneous, biased, poorly researched, or simply false. The only way for people to get reliable, accurate, and timely information is from you. If you have taken more than 20 hours of classes in human sexuality from reliable academic sources, you already know more than most of the world. When you complete a program like ISEE's sex therapy or sex educator certification, you are truly an expert. Dedicating approximately 200 hours to learning and supervision? You are an extremely rare commodity.
But there are other reasons. I was actually highly impressed with one program (on a free trial) that kept pushing me to create a narrated presentation from a PowerPoint. By the way, if you're doing this, please be careful. By uploading your presentation, you may grant permission for the AI to use it in its training, so you could be giving up the rights to your hard work. I uploaded a presentation that I don't need anymore.
When it was done (it took a while), I watched the presentation, listening to the narration by the AI voice. What was remarkable was how neatly and well the information was summarized. It even pulled in information from the web that was not part of the original presentation. What really blew my mind was that it pronounced my name right...both of them. This is extremely unusual, even for some people who are familiar with me or have known me a while. The presentation was incredibly professional and accurate, and the voice was pleasant. I could find almost no flaws.
Except for this one: it was boring. The presentation had no flair. The program could not extrapolate any information it didn't have, could not tell a funny story, could not give a deeper answer based on wisdom or experience, could not truly respond in any meaningful way to a question. It could not theorize. It could only spit out ideas from the presentation itself or pull information from the web.
I have friends who have used ChatGPT as a proxy therapist. One friend's initial reaction was positive (it comforted her with platitudes when she had to put her cat down), but ultimately she called me for the real thing: human comfort. A voice on the other end of the phone. Real love and concern.
AI cannot replace what it feels like to be in a live or live-streamed class with real human beings who are in the process of learning: asking questions, exchanging ideas, laughing at each other's jokes, feeling compassion for each other's feelings.
I'm not suggesting that there hasn't been a sea change in how we interact, or that everyone will come surging back into our therapy offices or classes en masse. I am saying that AI is composed of you. People. Millions of voices who have put thoughts into the web's sphere.
So, we must keep thinking creatively. Stay curious. Stay connected to embodied reality, our minds, and our spirits. As practitioners and as a field, we can be a force to encourage others to do the same.
