Comments

These findings have interesting implications for reading remediation in individuals with phonologic processing impairments because they suggest the possibility that these individuals might benefit from visual word learning strategies to circumvent the phonologic difficulties and directly train holistic visual word representations in the VWFA [visual word form area].

There are bouma disbelievers?

I think, on the contrary, some people actually overhype word shape without acknowledging the critiques of, and expansions to, the originally proposed ideas (which by now date back nearly 50 years).

Aoccdrnig to rsceearch at Cmarbidge Uinervtisy,
it deosn't mttaer in waht oredr the ltteers in a wrod aer,
the olny ipmoratnt tinhg is taht the frist and lsat ltteer
be in the rghit pclae. The rset can be a ttaol mses and
you can sitll raed it wouthit a porlbem. Tihs is bcuseae
the hmuan mnid deos not raed ervey ltteer by istelf,
but the wrod as a wohle.
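The rule the meme applies - keep the first and last letters of each word, shuffle the interior - is easy to reproduce. A minimal sketch (function names are my own):

```python
import random

def scramble(word: str) -> str:
    """Shuffle the interior letters, keeping the first and last fixed."""
    if len(word) <= 3:
        return word  # nothing (or only one letter) in the interior
    middle = list(word[1:-1])
    random.shuffle(middle)
    return word[0] + "".join(middle) + word[-1]

def scramble_text(text: str) -> str:
    # Only scramble purely alphabetic tokens so punctuation stays put.
    return " ".join(scramble(w) if w.isalpha() else w for w in text.split())

print(scramble_text("According to research at Cambridge University"))
```

Note that `random.shuffle` may occasionally leave a short interior unchanged, which is faithful to the meme: the constraint is only on the first and last letters.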

It's funny that we can still read that mess. And a bit puzzling too... how the hell do we do it?

I think it's a combination of two things:
- At least in the fovea it's normal to parallel-process individual letters into words even if they're jumbled. Although this is not as quick as bouma reading, it's still in the immersive layer (so we don't have to consciously try).
- But also, since a bouma is strongly defined by its silhouette, the first and last letters generally play a more significant role than the middle ones.

This might be because the latter's letter dislocations are too great (throwing off the parallel compilation), but it could instead/also be because the descender is moving too far, disrupting the bouma. Come to think of it, judicious dislocation of key glyphs (or even replacing certain ascending/descending letters with others) could be a key avenue for testing the [ir]relevance of boumas. Here it would be important to test the parafovea and not just the fovea, to guard against the possible dominance of the parallel-letterwise layer.

Also note that especially in longer (typically compound) words there could be more than one bouma, or part of it could be a –more prominent– bouma; in "Cmarbidge" the [relative] intactness of "bridge" could be serving as a significant aid. My favorite example here is "readjust": because of their high frequencies and notable boumas, "read" and "just" pop out... and ruin the correct reading (which is why I prefer "re-adjust").
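The "readjust" effect above can be shown mechanically by scanning a longer word for embedded dictionary words. A toy sketch, with an invented mini-lexicon standing in for the high-frequency words a reader knows:

```python
# Toy lexicon; a real test would use a frequency-weighted word list.
LEXICON = {"read", "just", "adjust", "bridge", "ridge"}

def embedded(word: str, lexicon=LEXICON) -> list[str]:
    """Return all lexicon words that appear as substrings of `word`."""
    hits = []
    for i in range(len(word)):
        for j in range(i + 2, len(word) + 1):  # substrings of length >= 2
            if word[i:j] in lexicon:
                hits.append(word[i:j])
    return hits

print(embedded("readjust"))   # ['read', 'adjust', 'just']
print(embedded("cambridge"))  # ['bridge', 'ridge']
```

The first output shows exactly the competition described above: "read" and "just" both surface alongside the intended "adjust".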

Interesting, I had not encountered actual numbers concerning that. Do you remember the source? And do you remember if it was concerning the fovea only or the entire field of vision?

I posit that the ratio would drop (perhaps even reversing) with the following: leveraging of the parafovea, reading experience, and good typography. For example if you're flashing Avant Garde point blank in front of high-schoolers, boumas don't have a chance.

Personally I was amazed to find that, for example, neural network theory was first applied to the reading process some 35 years ago, and yet many practitioners of type design are still wondering whether it is feature recognition, letter recognition, word recognition, etc. Much more intuitive is the scientifically confirmed finding that they are all interrelated, and any good model factors in this interdependency; that Cmarbidge folklore just does not do it justice - it's a reverse-engineering of the complexity of reading - not bad per se, but what does it really prove, or help with, for that matter... For example, this diagram from Rumelhart & McClelland, 1982.

The real question is what it means that the brain might, to some degree, have a tendency to perceive words based on the images they form. Some type aims to disrupt, other type to drown you in the comfy daze of centuries of convention and establishment. Both are fine, though, are they not?
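The interrelation mentioned above can be given a toy flavor in code. This is not the published Rumelhart & McClelland model - the numbers, lexicon, and update rule are invented for illustration - but it shows the core idea: letter evidence feeds word units bottom-up, while active words reinforce their own letters top-down.

```python
# Invented toy sketch of interactive activation; all values are illustrative.
WORDS = {"read": 0.0, "road": 0.0, "ride": 0.0}
letter_evidence = {"r": 1.0, "e": 0.8, "a": 0.6, "d": 1.0}  # noisy input

for _ in range(5):  # a few interaction cycles
    # Bottom-up: each word gathers evidence from its constituent letters.
    for w in WORDS:
        WORDS[w] += 0.1 * sum(letter_evidence.get(ch, 0.0) for ch in w)
    # Top-down: active words feed support back to their own letters.
    for w, act in WORDS.items():
        for ch in w:
            letter_evidence[ch] = letter_evidence.get(ch, 0.0) + 0.05 * act

best = max(WORDS, key=WORDS.get)
print(best)  # "read" wins: it shares the most supported letters
```

The point is the loop, not the arithmetic: no single level "does" the reading; the levels settle together.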

Personally I was amazed to find that, for example, neural network theory was first applied to the reading process some 35 years ago, and yet many practitioners of type design are still wondering whether it is feature recognition, letter recognition, word recognition, etc. Much more intuitive is the scientifically confirmed finding that they are all interrelated, and any good model factors in this interdependency; that Cmarbidge folklore just does not do it justice - it's a reverse-engineering of the complexity of reading - not bad per se, but what does it really prove, or help with, for that matter... For example, this diagram from Rumelhart & McClelland, 1982.

The real question is what it means that the brain might, to some degree, have a tendency to perceive words based on the images they form. Some type aims to disrupt, other type to drown you in the comfy daze of centuries of convention and establishment. Both are fine, though, are they not?

As Hrant suggested, it's pretty obvious that the process works in parallel - bouma and letter-decomposing at the same time.

There's a lot of neo-Platonism these days, and this is one of those cases: we learn stuff by expanding correlations, and that's why the bouma model seems more likely to take the cake.

It's way more elegant to have a supercluster (a mental word image) that allows variation (within reason), instead of the massive brain effort to decompose and constantly verify a word letter-by-letter.

But what surprises me more is why we are in an age where empiricism is considered invalid as a method.

You'll need to define which version of “empiricism” you mean before anybody can disagree. But clearly personal and individual experience is rather limited compared to experimental method, and when one is talking about how the brain works, it is awfully easy to fool oneself and hard to do double-blind experiments on oneself. (Let alone have n much greater than 1.)

You'll need to define which version of “empiricism” you mean before anybody can disagree.

But you hit Disagree anyway? :-)

To be fair though, I'm actually not sure what that last sentence means. Few people (and nobody in this thread, AFAIK) consider empiricism pointless. But I for one do believe it's only half the picture. A favorite quote from Paul Klee: "Where intuition is combined with exact research it speeds up the process of research." I would go further and say that intuition can actually guide research (even though research might in fact end up countering it). Einstein instinctively felt he was right before he could formally prove it. The people who have doubted the parallel-letterwise model might not be Einsteins, but just because they don't have as much empirical backing doesn't make their intuition wrong.

Yes, it's very easy to fool oneself in the absence of empiricism... but it's not much harder to do the same by leaning too heavily on empiricism. And now we're in a position where some notable empirical research (in fact since 2009, apparently) supports the existence of boumas, or at least whole-word reading. So unless one side can formally disprove the other side's empirical findings, intuition becomes the "tie-breaker" in terms of what one believes.

I would say the necessity of taking both empiricism and intuition seriously parallels the necessity of taking both parallel-letterwise and bouma reading seriously.

You'll need to define which version of “empiricism” you mean before anybody can disagree. But clearly personal and individual experience is rather limited compared to experimental method, and when one is talking about how the brain works, it is awfully easy to fool oneself and hard to do double-blind experiments on oneself. (Let alone have n much greater than 1.)

Fair enough: I meant that one shouldn't draw immediate conclusions from insufficient data, nor should one discard intuition when something feels off. On the other hand, reason alone doesn't do much.

With this said, I wasn't building towers to empiricism, nor to data-driven conclusions. Relying on either one alone is an act of faith.

They should work together, because even if we have all the data and facts in the world, we still need to explain them. And if we have all the flawless reasoning possible, we still need verification.

This might be because the latter's letter dislocations are too great (throwing off the parallel compilation) but it could instead/also be because the descender is moving too far, disrupting the bouma.

For myself, the first thing I think of is that in Cgiarbmde, the "g" has become a hard g, while in Cmarbidge, the sounds stay the same, but are re-ordered.

The idea that it is all interrelated - we see the individual letters, the bouma, the context of the words, the apparent sound values - makes perfect sense to me.

And with such a complex process, learning a new and different script means throwing most of it away until facility in the new script is acquired. So even if it would cure dyslexia, I don't think we will switch to an adaptation of the Korean writing system any time soon.

Aoccdrnig to rsceearch at Cmarbidge Uinervtisy,
it deosn't mttaer in waht oredr the ltteers in a wrod aer,
the olny ipmoratnt tinhg is taht the frist and lsat ltteer
be in the rghit pclae. The rset can be a ttaol mses and
you can sitll raed it wouthit a porlbem. Tihs is bcuseae
the hmuan mnid deos not raed ervey ltteer by istelf,
but the wrod as a wohle.

It's funny that we can still read that mess. And a bit puzzling too... how the hell do we do it?

Your brain does autocorrection and the filling-in of missing content all the time. We can read it because the brain uses past experience to predict what's coming next.

It's not just the groupings of letters; it's the capitalization, the punctuation, the spaces, the need for an intelligible sentence to have a subject and a verb - the brain uses all of these and many other 'clues' as well to make meaning, come hell or high water. (If it can't make meaning, it will make stuff up.) Brains use statistical probability to extract meaning, even if you are not consciously aware of it.

So, setting aside all the other clues your brain uses, think of it this way: the average length of a word in English is 5 letters. If the first and last letters are all correct, that leaves, on average, only three letters that the brain has to unscramble to figure out the word the writer deliberately misspelled. Heck, even if you only speak Russian, you could get the meaning of that passage just by trial-and-error unscrambling using a Russian-English dictionary.

But if you do speak English, it's a piece of cake. The brain doesn't need to translate, and it can unscramble the letters to make meaning on the fly, guided by the probabilities provided by the rules of grammar and by the words already unscrambled before the word currently under scrutiny - and that's only when the word hasn't already been guessed correctly by your brain without you having to unscramble any letters at all.
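The unscrambling step described above - match a jumbled token on its first letter, its last letter, and the multiset of letters in between - can be sketched directly. The mini-lexicon here is an assumption for illustration; a real reader's "lexicon" would be huge and frequency-weighted:

```python
from collections import Counter

# Toy stand-in for the reader's vocabulary.
LEXICON = ["according", "research", "university", "important", "matter"]

def candidates(jumbled: str, lexicon=LEXICON) -> list[str]:
    """Lexicon words sharing the jumbled token's first letter,
    last letter, and multiset of letters - the cues the passage
    says the brain holds on to."""
    key = (jumbled[0], jumbled[-1], Counter(jumbled))
    return [w for w in lexicon
            if (w[0], w[-1], Counter(w)) == key]

print(candidates("aoccdrnig"))   # ['according']
print(candidates("uinervtisy"))  # ['university']
```

With an average of only three interior letters, the candidate set is almost always a singleton, which is one concrete reason the meme text reads so easily.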

This was very interesting, thanks, Hrant.

It's also in keeping with the observation that the eye jumps around and sees many, if not most, words in a passing glance or with peripheral vision only. If word shape wasn't helpful in some way, I fail to see how that could happen.

And it raises a lot of interesting questions. If a word is, indeed, remembered as a 'picture', what happens when you change the font? Different font, different picture, right? How does the brain handle that? Don't know.

Richard Fink said: If a word is, indeed, remembered as a 'picture', what happens when you change the font?

Presumably the picture is fuzzy (figuratively, but perhaps even literally). A bouma would be consistent enough across –good enough– fonts to make things click (with a frequency proportional to the quality of the font). This is why I oppose things like the "f" in Fedra (Roman), which descends. And this is not just hypothetical:

If word shape were critical, it would be hard to explain how people read all-caps text

Never critical, but "merely" helpful. And I would posit more helpful than the difference between a highly readable font and an average one. The bottom line is that if boumas are indeed read, we must design with them in mind, which is delicate work; in contrast, designing sufficiently-legible letters is child's play.

BTW letters aren't critical either. :-) Which is why we miss typos so often. It's notable that people have reported switching to a hard-to-read font when proofreading to catch mistakes; presumably this is because boumas are inhibited and letters become more central.