Thread: "Lack of selectivity for syntax relative to word meanings throughout the language network" https://tinyurl.com/y77cuznv is out in Cognition and is the culmination of one line of work, which I have been pursuing for the last ~15 years! Here is the story of how it all unfolded.
When I started working with @Nancy_Kanwisher on language, ca. 2006, I brought with me priors—a combo of early training in generative grammar, and time spent working on sentence processing in @LanguageMIT where the focus had then been on understanding syntactic complexity.
Combined with the leading proposals of the neural architecture of language, these priors led me to expect a clear separation in the mind and brain between mechanisms that handle the processing of word meanings, and mechanisms that support syntactic/combinatorial processing.
We started with a simple paradigm crossing word meanings and structure, looking at responses to stimuli that have both (sentences), stimuli that just have word meanings (word-lists), stimuli that just have structure (“Jabberwocky”), and stimuli that have neither (nonword-lists).
We looked for brain regions selective for word meanings (Sentences+Word-lists > Jabberwocky+Nonword-lists), and other regions selective for structure (Sentences+Jabberwocky > Word-lists+Nonword-lists). I spent 2-3 years looking for this dissociation, using diverse analyses, in vain.
No brain region, or even a set of non-contiguous voxels, showed selectivity for structure (some regions showed stronger responses to word meanings). And we found a similar pattern using a method with high spatial and high temporal resolution (ECoG): https://tinyurl.com/y8kjvp3m .
2 other things happened in parallel. 1. We found that lang regions don't respond to non-ling stimuli that contain hierarchical structural relations, like music and math, calling into question the idea of a brain region that supports abstract structural processing across domains.
2. I was reading the lit that questioned the syntax/lexico-semantic divide. This lit is vast, spanning linguistics (Sag, Bybee, @adelegoldberg1, Jackendoff, etc.), experimental work in development, adults, individuals with aphasia (incl. Liz Bates' great work) and comp. modeling.
Combined with our fMRI+ECoG data, I became convinced that the dissociation does not hold. The ‘pushback’ I typically got when presenting these data concerned other paradigms commonly used in cog.neuro. of language: synt/sem violations, adaptation to word meanings vs. syntax, etc.
So we collected fMRI data on 3 paradigms: the 2 just mentioned, and a paradigm from Dapretto&Bookheimer (1999), a paper oft cited as evidence of a syntax/semantics dissociation (see https://tinyurl.com/yavfxc8k by @MattSiegelman for evidence of non-replication of the original findings).
The results were unequivocal: every region of the language network was sensitive to both sentence structure and word meanings (often showing stronger responses to the latter), but no region responded selectively to structure.
Considering the current empirical landscape of the field, I don’t think any (robust and replicable) results remain that show that some brain region is selective for syntactic processing.
And evidence is accumulating for the dominance of linguistic behavioral and neural responses by the *meaning* in the signal, in line with the key goal of language — to share meanings with other minds.
P.S. More work is needed: searching for syntax selectivity at the level of cortical layers, circuits, or neurons (which we're pursuing), and probing lexico-semantic vs. syntactic processing in production (in-prep work by JennHu, @smallhannahe et al. from our lab paints a similar picture there).
You can follow @ev_fedorenko.