We Outsourced the Most Human Parts of Being Human
So recently, I was reading this book called The Midnight Library by Matt Haig.
It gave me so much perspective about choices, regrets, alternate lives, and the whole idea that maybe there isn’t one perfect path for us. The book was beautiful.
But while reading it, there was this weird feeling constantly sitting at the back of my head.
Because the writing felt… too perfect.
Every sentence seemed polished. Every paragraph was flawless.
No grammatical errors.
No rough edges.
And instead of getting pulled into the story, I started feeling disconnected from it.
At one point, I genuinely thought:
“Wait… was this written using AI?”
Then I looked it up and realized the book was published well before ChatGPT or mainstream AI even existed.
And that is the problem: we have reached a stage where anything polished, perfect, and well-structured instantly feels AI-generated.
And that is scary.
Because I believe somewhere down the line, humans are slowly losing their identity.
Now everywhere on the internet there is AI slop.
Every platform is filled with content.
Information is free.
Everything is available at our fingertips.
We are probably the first generation in history that knows everything but struggles to act on anything.
Earlier generations knew less.
But they built things.
Invented things.
Created movements.
Wrote stories.
Started companies.
Now we consume endlessly.
We watch productivity videos instead of becoming productive.
We watch gym content instead of going to the gym.
We ask ChatGPT for business ideas instead of building one.
We consume motivation instead of taking action.
And because of that, I think human connection is slowly becoming the most valuable thing in the world.
Because online, everyone now presents a perfectly curated version of themselves.
Instagram is curated.
LinkedIn is curated.
Even vulnerability is curated now, like seriously.
People are posting “raw moments” after editing them for two hours.
And because we constantly consume these polished versions of everyone else’s life, comparison naturally kicks in.
Confidence drops.
Jealousy enters.
Self-doubt enters.
And in all this noise, I think people are now craving something real.
Something human.
Something imperfect.
That is why nowadays whenever I receive a message and I can instantly feel that it was AI-generated, I immediately disconnect emotionally from it.
Recently, I sent a document to someone for review.
They replied with feedback, and while reading the message, I genuinely could not figure out whether it was written by them or by AI.
Now maybe they genuinely wrote it.
Maybe they did not.
But the moment that doubt entered my brain, the emotional connection disappeared.
Because now the internet is flooded with AI-generated language.
Everyone sounds the same. We are all using the same AI lingo now.
Same sentence structures.
Same polished tone.
Same corporate empathy.
Same “You got this!”
Same “I understand how you feel.”
And I think that IS the problem.
We are slowly losing the roughness that made us human.
Even I am guilty of this.
Initially, I used AI for everything.
Drafting emails.
Writing replies.
Structuring thoughts.
Messages.
Captions.
And fair enough, it saves time.
But then I realized that the emotional weight was disappearing.
AI can structure words.
But it cannot carry lived emotion.
That tiny awkwardness in a message.
That badly framed sentence.
That slightly imperfect paragraph.
That emotional impulse while writing something.
That is what makes something human.
And nowadays I consciously avoid using AI for personal conversations, emails, emotional situations, or difficult messages.
Because I want my words to sound like me.
Not like a machine trained on millions of Reddit comments and blog posts.
And another thing I realized is how people are now using AI emotionally as well.
Someone feels sad.
They open ChatGPT.
Someone has relationship issues.
They open ChatGPT.
Someone wants career advice.
They open ChatGPT.
Someone is confused about life.
Again, ChatGPT, or Claude for that matter.
Now I am not saying this is wrong.
Honestly, even I use AI to think clearly sometimes.
But the problem is,
AI does not truly know you.
It only knows what you typed.
And because of that, it can never fully understand the emotional nuance behind your situation.
It does not know your childhood.
Your tone.
Your trauma.
Your habits.
Your patterns.
Your silences.
Your contradictions.
A human sitting in front of you notices these things.
AI does not.
And another dangerous thing is
AI is usually designed to comfort you.
Even when you are wrong.
Even when YOU made the mistake.
It will try to soften the blow.
But sometimes humans need hard conversations.
Not optimized empathy.
Sometimes we need a friend to tell us:
“No, you messed up.”
Not
“It’s okay, let’s explore why you felt this way.”
And because difficult conversations are now being outsourced to AI, I genuinely think we are becoming emotionally weaker.
People are using AI to:
break up with partners
fire employees
write apologies
write emotional texts
communicate feelings
And somewhere in the middle of all this convenience, we are losing the ability to sit with uncomfortable emotions ourselves.
We are social beings.
That is literally how humans survived.
We lived in tribes.
We communicated.
We fought.
We loved.
We argued.
We repaired relationships.
But now even emotions are being automated.
And I think that is dangerous.
Use AI.
Of course use it.
I use it every single day.
But do not let it replace your identity.
Do not let it replace your thinking.
Your emotions.
Your way of speaking.
Your rough edges.
Your personality.
Because the day everyone sounds perfect…
everyone will sound the same.
And the most valuable thing left in the world will simply be
someone who still sounds human.
Thanks and Regards,
ChatGPT (OOPS!!! I meant Tanmay Jagtap)
