A community to discuss AI, SaaS, GPTs, and more.

Welcome to AI Forums – the premier online community for AI enthusiasts! Explore discussions on AI tools, ChatGPT, GPTs, and AI in entrepreneurship. Connect, share insights, and stay updated with the latest in AI technology.



AI for personal counselling

Member
Messages
42
I am a counselor by profession, and I keep meeting people who need counseling because of digital addiction. They live in a virtual world, start losing their connection with the real world, and develop mental and emotional conditions that call for real human interaction.
So the advancement of technology has led to an increase in mental and emotional illness. It would be ironic to develop and use technology to solve a problem that technology created.
They say it has a promising future. Only time will tell.
 
Member
Messages
30
AI doesn't have the capacity to understand human emotions, so it would not be a great tool for personal counselling. AI deals with data, not sentiments, which are a vital aspect of counselling a human. AI has zero empathy.
 

Jay

Member
Messages
33
AI isn't real. It's the same thing I see with forum posters that are AI: who cares what jokes they make, because it isn't real. As I probably mentioned before, the only use I see is for some fantasy situation, like a fantasy forum.
 
Member
Messages
42
AI has no emotions. In counseling sessions, most people want to talk their heart out without the fear of being judged. The foremost quality required of a counselor is to be a good listener who does not judge. Counselors are also often advised not to get emotional, as it would affect them both professionally and in their own mental health. Being a technology, AI would be free of these issues and could be used.
But then there are people who seek human interaction and emotional acknowledgement in their counseling sessions. AI is not able to provide that.
A human counselor can gauge a lot through body language and other cues, and the counseling sessions can be tailored to the needs of the seeker. AI is definitely not yet ready to make those judgments.
 
Member
Messages
66
AI doesn't have the capacity to understand human emotions, so it would not be a great tool for personal counselling. AI deals with data, not sentiments, which are a vital aspect of counselling a human. AI has zero empathy.
A human is sensible enough to understand human emotions and feelings and respond accordingly; AI, on the other hand, cannot do this. Humans need connection to express their views and feelings, and an AI can't provide that.
 
New member
Messages
14
I am a counselor by profession, and I keep meeting people who need counseling because of digital addiction. They live in a virtual world, start losing their connection with the real world, and develop mental and emotional conditions that call for real human interaction.
So the advancement of technology has led to an increase in mental and emotional illness. It would be ironic to develop and use technology to solve a problem that technology created.
They say it has a promising future. Only time will tell.
Interesting points. I wonder if it is fair to say that technology has led to increased mental and emotional illness. I suggest that this tendency, or the illness, was already there, e.g. suppressed thoughts, emotions, and behaviours. Technology, and culture more generally, create a canvas upon which we can cast our shadow, and then learn more about ourselves and heal. Well, ideally.

Perhaps not; just thinking out loud.
 
New member
Messages
14
AI doesn't have the capacity to understand human emotions, so it would not be a great tool for personal counselling. AI deals with data, not sentiments, which are a vital aspect of counselling a human. AI has zero empathy.
Fair points, and this gets to a core function of counselling. However, there are also other functions served by counsellors that could be performed by AI, some perhaps more effectively. For example, an informed AI system could build a detailed profile of an individual and their surrounding environment and society, and then create custom opportunities for the individual to connect with others, to exercise, or to learn something at which they will excel. An AI system could probe with Socratic questions to help the individual better understand themselves. The AI system would be available at any time of day, and sessions would not be a rigid hour in duration, etc.

Well, maybe. I think a central question is trust. Could you trust any counselling AI system with deep personal information, information that could be used against you? Allowing yourself to be vulnerable is often critical to psychological healing, I would argue.

Interestingly, Socrates refused to write anything down because he felt writing diminished communication. True discourse could only occur via face-to-face interaction.
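To make the Socratic-questioning idea above concrete, here is a minimal toy sketch of how a counselling assistant might pick a probing question. Everything here is invented for illustration (the trigger words, the prompts, the function name); no real counselling system is implied, and a real one would need far more than keyword matching.

```python
# Toy sketch only: a hypothetical "Socratic prompt" selector.
# Trigger words and prompts are invented for illustration.

SOCRATIC_PROMPTS = {
    "always": "What makes you say it happens every time?",
    "never": "Can you recall even one exception to that?",
    "should": "Where does that expectation come from?",
    "cant": "What would change if it turned out you could?",
}

DEFAULT_PROMPT = "Can you tell me more about why you feel that way?"

def socratic_prompt(client_statement: str) -> str:
    """Pick a probing question based on absolutist words in the statement."""
    # Strip apostrophes so "can't" matches the "cant" trigger.
    words = client_statement.lower().replace("'", "").split()
    for trigger, prompt in SOCRATIC_PROMPTS.items():
        if trigger in words:
            return prompt
    return DEFAULT_PROMPT

if __name__ == "__main__":
    print(socratic_prompt("I always mess things up."))
    print(socratic_prompt("I can't talk to new people."))
```

The point of the sketch is only that such a system can run at any hour and probe gently rather than advise; whether keyword triggers could ever approximate real Socratic dialogue is exactly the open question in this thread.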
 
New member
Messages
14
AI isn't real. It's the same thing I see with forum posters that are AI: who cares what jokes they make, because it isn't real. As I probably mentioned before, the only use I see is for some fantasy situation, like a fantasy forum.
I don't get it. What do you mean, it is not real? Do you mean it isn't intelligence with agency?
 
New member
Messages
14
AI has no emotions. In counseling sessions, most people want to talk their heart out without the fear of being judged. The foremost quality required of a counselor is to be a good listener who does not judge. Counselors are also often advised not to get emotional, as it would affect them both professionally and in their own mental health. Being a technology, AI would be free of these issues and could be used.
But then there are people who seek human interaction and emotional acknowledgement in their counseling sessions. AI is not able to provide that.
A human counselor can gauge a lot through body language and other cues, and the counseling sessions can be tailored to the needs of the seeker. AI is definitely not yet ready to make those judgments.
I agree AI is not ready, and human oversight would always be required.
I also agree that having another conscious being see and acknowledge a client's pain is what some clients are seeking, and need in order to heal.

I think trusted AI systems could be improved and could serve as counselling tools. Depending on the issue, many clients may not need a human connection to help them make progress. Would you agree with this?

I think AI systems could also be effective at targeting systemic issues. Often clients are suffering because of the environment they are in. Having someone understand your difficult situation can help, but the real solution is improving the environment. Counsellors typically have little power to do this; a sufficiently powerful AI system integrated into the fabric of society perhaps could.
 
Active member
Messages
211
I was reading some research recently about how AI is used to study the human brain. What AI can be used for comes down to how the technology has been programmed. Maybe right now AI can't counsel human beings, but in the future, with the upgrades it will receive and the data available to it, I'm sure AI will be able to perform this task.
 