These three paragraphs from the Forte Labs article strike me as the most pertinent to this discussion (two rough sketches of how I read the mechanism follow the quotes):
"What the mind is doing when it “recognizes” an image is not matching it against a database of static images. There is no such database in the brain. Instead, it is reconstructing that image on the fly, drawing on many conceptual levels, mixing and matching thousands of patterns at many levels of abstraction to see which ones fit the electric signals coming in through the retina."
"Patterns triggered in the neocortex trigger other patterns. Partially complete patterns send signals down the conceptual hierarchy, fitting new lenses to the data. Completed patterns send signals up, fitting new data to the lenses. Some patterns refer to themselves recursively, giving us the ability to think about our thinking or to “go meta.” An element of a pattern can be a decision point for another pattern, creating conditional relationships. Many patterns are highly redundant, with PRs dedicated to linguistic, visual, auditory, and tactile versions of the same object, which is what allows us to recognize apples in many different contexts."
"Paradoxically, a conceptual hierarchy made up of massively parallel pattern recognizers would explain a lot about our subjective experience. The feeling that something is “on the tip of the tongue” could be pattern recognizers firing below the level they become conscious. The certainty of “I know it when I see it” could be combinations of PRs firing without a corresponding, higher-order word label. Our intuition acquires new depths when it isn’t limited to conscious patterns."