Poorly Thought-Out Life

Explanation comes second

A recent study on LLMs pointed out that chain-of-thought can reduce performance on tasks where deliberate thinking makes humans worse. In other words, LLMs do better when asked only for the answer but falter when asked for the reasoning behind it.

This is not surprising, since reasoning is not inherent to these systems (it is not part of the design) but a proxy built on top of their complex, black-box architecture. That's not to say chain of thought never helps; it shines on well-defined, rule-based tasks.

Interestingly, the same idea applies to the human brain, a far more complex system we barely understand. We like to believe we are rational, logical beings, but it is beyond our capacity to interpret and explain everything our brain concludes or makes us feel.

How hard is it to explain why we like one thing over another? Why does one like ice cream? Why does one like going to the beach? Why do we stop to look at sunsets? Our brain, like an LLM, doesn't work step by step like a mathematical proof. Our thoughts often arrive as answers first, with explanations trailing behind.

At a higher level, we are bounded by the expressiveness of language when we try to capture the nuances of what we are feeling. The struggle shows whenever we verbalise how we feel. What do you feel when you listen to “Photograph” or “The Night We Met”? What do you feel when a wave splashes over your feet on the beach? What do you feel when you see a falling star? “Calm”, “peaceful”, “serene”, “whole”: these are words that fall short of capturing the depth of the feeling.

Ask someone what they look for in a partner and they’ll give you a checklist. And yet they end up with someone who bears no resemblance to the checklist. It’s hard for anyone to explain that connection. We've invented terms like "vibe" to describe something our brain understands intuitively but can’t articulate fully.

When it comes to making decisions, we resort to calling it a “gut” feeling. It’s your brain telling you the answer but unable to generate an explanation. And that’s not a fault in the system. An explanation should not be a prerequisite to a decision. If you require an explanation before you make a decision, in the best case you’ll waste time, and in the worst case you’ll remain stuck. The need to justify choices, especially to friends or family, can lead us to discard good options simply because they don’t "make sense."

Not all decisions are explainable, and many of the most important ones aren’t. Trust that your brain, like an LLM, is working in your favor, even when it doesn’t offer reasons you can articulate. Explanation, after all, comes second.