Excellent video. This is my summary:
*Spoken language is learned prior to reading (6 mo: vowels; 12 mo: consonants; 2 yr: syntax; 3 yr: well-organized language system; then we begin to learn to read, i.e., associate visual symbols with the phonemes of spoken language, which already carry meaning [mental imagery]).
*Learning to read improves V1 (occipital cortex) discrimination ability (especially along the retinotopic horizontal line where the letters of words usually reside, for English at least), and improves discrimination of phonemes (roughly doubled activation of the Planum Temporale).
*Learning to read displaces some face recognition to right hemisphere.
*Learning to read uses mostly labile voxels (cortex not yet committed to another function); hence it is hard for an adult to learn to read a new language, as most of the VWFA (Visual Word Form Area) and the surrounding face area have been 'frozen' into other dedicated uses.
*When adults force themselves to learn to read, they may have to use plastic neurons distributed in parietal cortex rather than the single VWFA; as a result, they may only ever acquire letter-by-letter reading ability (children who learn to read using the VWFA can read long words as fast as short words, because the VWFA processes the letters in parallel or in chunks).
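The parallel-vs-serial contrast above can be sketched as a toy timing model (my own illustration with made-up constants, not data from the video):

```python
# Toy model: VWFA-style parallel recognition is length-invariant,
# while letter-by-letter reading slows linearly with word length.
# The millisecond constants are arbitrary, for illustration only.

def vwfa_read_time(word, base_ms=200):
    """Parallel route: all letters processed at once -> constant time."""
    return base_ms

def letter_by_letter_time(word, base_ms=200, per_letter_ms=100):
    """Serial route: each extra letter adds a fixed cost."""
    return base_ms + per_letter_ms * len(word)

for w in ["cat", "reading", "hippopotamus"]:
    print(w, vwfa_read_time(w), letter_by_letter_time(w))
```

The flat curve versus the rising curve is exactly the behavioral signature used to distinguish fluent readers from letter-by-letter readers.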
*When a word is flashed too fast to reach consciousness, the VWFA still lights up, with a representation invariant even to the specific font/typeface of the word!
*Edward Chang (2010, 2014) showed (intracranial recordings during surgery) that electrodes over superior temporal cortex respond selectively to phonetic features of speech such as labials, fricatives, and plosives.
*We all have the VWFA in the same place in the brain (±5 mm).
*The VWFA (and surrounding area?) responds to words, faces, objects, and scenes, but in a specialized way: certain subregions respond only to words, some only to faces, etc.
*The VWFA likely evolved as a recycling of the face area, which looks for distances and directions between the eyes, etc. My contribution: faces and letters may be recognized as phi deviations (imagine a point below/behind the letter/face, with vector-direction rays going to the point; it is the phi deviations that capture the 2D, depth-agnostic shape component).
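A toy sketch of my phi-deviation idea (my own illustration; `ray_angles` and the anchor point are hypothetical constructs, not anything from the video):

```python
import math

def ray_angles(contour, anchor):
    """Angle of the ray from each contour point to a fixed anchor point
    'behind' the shape. These angles are unchanged by uniform scaling
    about the anchor, giving a size-agnostic shape signature (the
    'phi deviations')."""
    ax, ay = anchor
    return [math.atan2(ay - y, ax - x) for x, y in contour]

# A crude 'L' shape, and the same shape scaled 2x about the anchor.
anchor = (0.0, 0.0)
L_small = [(1, 3), (1, 1), (3, 1)]
L_big = [(2, 6), (2, 2), (6, 2)]
print(ray_angles(L_small, anchor))
print(ray_angles(L_big, anchor))  # same angles -> scale-invariant
```

The point of the sketch is only that a ray-direction signature discards overall size, which is one kind of invariance a recognition area would need.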
*Loss of the VWFA results in 'pure alexia,' where the person can still speak and understand spoken language, and can even still write, but cannot read what he or she has written. Those who lose the VWFA can relearn to read, with great difficulty, using distributed still-plastic neurons in parietal cortex.
*The VWFA is where representation becomes abstract enough to separate the word from its font, and from wherever it landed on the retina/retinotopic V1. V1 is the only region capable of resolving fine print.
*PFC/IPS have neurons that prefer two objects, or five objects. We recycle parts of these areas to do math.
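A toy illustration of those number-preferring neurons (my own sketch; the log-Gaussian tuning is the standard Nieder-style model from the numerosity literature, not something stated in the video):

```python
import math

def numerosity_response(preferred_n, n, width=0.3):
    """Toy 'number neuron': fires most for its preferred numerosity and
    falls off as a Gaussian on a log scale, so tuning is broader for
    larger numbers. 'width' is an arbitrary illustrative constant."""
    return math.exp(-((math.log(n) - math.log(preferred_n)) ** 2)
                    / (2 * width ** 2))

# A neuron preferring 5 objects responds most to 5, less to 4 or 7.
for n in (2, 4, 5, 7):
    print(n, round(numerosity_response(5, n), 3))
```

Recycling such graded numerosity detectors into exact symbolic arithmetic is the claimed parallel to recycling the face area for reading.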
What would be interesting to observe is the change in brain function when a child or adult experiences significant changes in motor-sensory integration, i.e., changes in motor-control skills, sound-processing skills, binocular vision and visual-processing skills, or all of these areas. We have clients who have moved from ADHD, dyslexia, etc. to being efficient learners through an integrated programme. Inputs to the brain are an important factor that does not seem to be considered.
Great video. Thanks
There are people who cannot read all fonts, or who find it much more difficult to identify letters and words in some fonts compared to others: serif more difficult than sans serif, for example.