[Epistemic status: This is my working model. I think something like this is probably happening irl. Some of the details of neurology, anatomy, and evolutionary biology are probably wrong. I'd be only slightly surprised if I converted from HOP (higher-order perception) theory to some sort of HOT (higher-order thought) theory in the next year, but I don't think that would have strong practical implications.]
Trigger-action plans exist on a spectrum. Over on the left, you have TAPs like "If I enter my house through my front door, I'll put my keys in the box on the side table." On the right you have TAPs like "If I'm confused, I'll stop and compare what I expected to happen to what happened instead."
keys <-----------------> confusion
Roughly speaking, the stuff on the left is physical, and the stuff on the right is cognitive.
The stuff on the right seems to be harder. Why is that? This post is about my attempt to answer that question.
How do you know when you've just opened your front door? You saw the door in front of you, felt the knob turn in your hand, heard a creaking sound as it opened, and now you see a hole where the door used to be.
How do you know when you've just felt confusion? In my case, I'd know because I'd have noticed feeling a sudden burst of surprise followed by a lack of resolution that's now developed into a hanging that's-not-rightness.
But I know that because I spent a long time studying my own reactions to confusing situations. I attended strategically to confusion. If you asked me five years ago how I know when I'm confused, I might have said, "Well, I just... know, you know?"
And if you'd asked me five years ago, I'd have been wrong. The truth would have been, "I usually don't know when I'm confused."
I think of human introspection as analogous to the parietal eyes of lizards. Lizards (and some other animals) have a light sensor atop their heads that can't detect anything more specific than the presence or absence of light.
If you took away a lizard's true eyes and left it with just the primitive third eye, it would have something almost but not quite entirely unlike vision. It could distinguish night from day, but certainly not knights from daisies. In other words, it would be about as blind as its distant ancestors who had just begun to develop sight. Lizard-relevant parts of the world would be way more complicated than its vision could handle.
My best guess about why introspection is harder than outrospection is this: We're in an awkward evolutionary stage where the human-relevant goings-on inside our brains are way more complicated than our shiny new prefrontal cortices can handle.
We have an organ that lets us perceive high-order cognitive algorithms like "my inferences from what my model of Karen's brain predicts I will say", or "the thing happening in my auditory cortex when I hear E above middle C".* But we still have the primitive version of the organ. We've not yet evolved true introspection. So we can perceive our thoughts and feelings, maybe for the first time in evolutionary history, but our perception tends to be vague, fuzzy, and weak. Night and day, not knights and daisies.
But there's a funny thing about perception of cognitive algorithms.
Imagine you're playing Where's Waldo...
...but instead of carefully scanning through the chaos, you can turn everything without red stripes into a perfectly blank white background. Suddenly, the game wouldn't push your visual processing to its limits. Finding Waldo would be easy.
You can't change a physical image just by thinking about it, but you can change your cognitive algorithms by thinking. That's what thinking is.
So introspection is hard because our PFC is primitive, but there are still things we can do to make it easier. If I want to train a thoroughly cognitive trigger-action plan, my strategy should make things as easy as possible for my primitive PFC.
As far as I can tell, the art of streamlining thought for successful perception consists mostly of strategic use of attention. Attending in ways that make the most of a human PFC will be the subject of an upcoming post.
*Considering introspection to be a "sense" is a minority position among philosophers of mind (I think?). If you're curious about other perspectives, I recommend the SEP article on higher-order theories of consciousness.