Review of Phonological Processing in Sign Languages and Its Neural Mechanisms

Zhang Xiaohong, Li Hong

Journal of Psychological Science ›› 2023, Vol. 46 ›› Issue (4) : 1017-1023. DOI: 10.16719/j.cnki.1671-6981.202304031

Theories & History of Psychology


Abstract

Phonological processing concerns access to and use of mental phonological representations and is essential to language comprehension and production. However, most findings and theories on phonological processing are based on spoken or written languages, with relatively little evidence from sign languages. As natural human languages, sign languages can also be analyzed at a phonological level, but they are articulated in a visual-gestural modality and have unique features that distinguish them from spoken languages: (1) the manual and largely simultaneous articulation of phonological units or parameters (e.g., location, handshape, and movement), and (2) a tight relation between phonology and semantics due to the prevalence of iconicity, i.e., sign forms visually resembling their meanings. These features may give rise to phonological processing mechanisms different from those found in spoken languages, and hence pose challenges to current language processing theories. To reveal what is fundamental and what is modality-specific about language processing, this article reviews recent work on phonological processing in sign languages, with a focus on its empirical evidence and neural mechanisms.
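As an illustration of the parameter-based description above, the following sketch (Python, with an invented, simplified parameter inventory and invented example entries; not code or data from the reviewed studies) represents a sign as a bundle of simultaneously specified parameters and checks how much form two signs share:

from dataclasses import dataclass

# Hypothetical, simplified representation of a sign as a bundle of
# simultaneously articulated phonological parameters (location, handshape,
# movement); real phonological models distinguish many more features.
@dataclass(frozen=True)
class Sign:
    gloss: str       # label for the sign
    location: str    # place of articulation
    handshape: str   # configuration of the dominant hand
    movement: str    # path or internal movement

def shared_parameters(a: Sign, b: Sign) -> set:
    """Return the set of parameters on which two signs overlap in form."""
    return {name for name in ("location", "handshape", "movement")
            if getattr(a, name) == getattr(b, name)}

# Invented example entries, for illustration only:
sign_a = Sign("APPLE", location="cheek", handshape="X", movement="twist")
sign_b = Sign("ONION", location="temple", handshape="X", movement="twist")
print(sorted(shared_parameters(sign_a, sign_b)))  # ['handshape', 'movement']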
On the one hand, studies using different paradigms and techniques show that parameter-based phonological information is activated and used in sign recognition. First, signs with higher phonological neighborhood density are recognized more slowly than those with lower density, indicating that phonological competitors are activated and compete for identification. Second, priming studies have found that prime-target sign pairs with one- or two-parameter overlap are responded to at different speeds, or elicit N400s of different amplitude, compared with unrelated pairs. In addition, eye-tracking studies using the visual world paradigm show that, compared with unrelated pictures, participants spend more time looking at competitor pictures whose corresponding signs share one or two parameters with the target signs. These findings indicate that phonological processing in sign languages is psychologically real, and that the processing units may include both individual parameters and two-parameter combinations. However, results are mixed regarding the precise effects of these units, with some studies showing facilitative effects of location overlap, some showing inhibitory effects, and others showing no effect of location but facilitative effects of handshape and location-handshape overlap.
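To make the two measures discussed above concrete, the sketch below (reusing the hypothetical Sign class and shared_parameters helper from the previous sketch) counts a sign's phonological neighbours under one possible operationalization, overlap on at least two of the three parameters, and labels prime-target pairs by their degree of overlap; actual studies differ in how they define neighbours and overlap:

def neighborhood_density(target: Sign, lexicon: list) -> int:
    """Count lexicon entries overlapping with the target on at least two of
    the three parameters (one possible definition of a phonological neighbour)."""
    return sum(1 for s in lexicon
               if s.gloss != target.gloss
               and len(shared_parameters(target, s)) >= 2)

def prime_condition(prime: Sign, target: Sign) -> str:
    """Label a prime-target pair by how many parameters overlap, mirroring the
    unrelated / one-parameter / two-parameter conditions used in priming studies."""
    n = len(shared_parameters(prime, target))
    return {0: "unrelated", 1: "one-parameter", 2: "two-parameter"}.get(n, "identical")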
On the other hand, brain imaging studies using PET, fMRI, and TMS have shown that sign languages share similar neural mechanisms of phonological processing with spoken languages. In sign perception and comprehension tasks, signers tend to activate the superior temporal cortex bilaterally when viewing signs or sign-phonetic and syllabic units. In explicit phonological judgement tasks, a left-lateralised network is engaged, including the left inferior frontal cortex, the supramarginal gyrus, and the superior parietal lobule. The same regions have also been implicated in speech sound perception and word rhyme judgement, indicating that the two language modalities share some phonological processing mechanisms. Differences also exist, though: at the whole-brain level, handshape judgement elicits significant activation in the left occipital lobe, whereas speech rhyme judgement instead engages a fronto-parietal network and the cerebellum bilaterally. It remains unclear whether these differences reflect modality-specific processing mechanisms or the sensory properties of the phonological parameters.
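Left-lateralisation of the kind reported for explicit phonological judgement is typically quantified with a laterality index; the sketch below shows the conventional formula LI = (L - R) / (L + R) with invented activation values, not data from the reviewed studies:

def laterality_index(left: float, right: float) -> float:
    """Conventional laterality index: positive values indicate stronger
    left-hemisphere activation, negative values stronger right-hemisphere
    activation."""
    return (left - right) / (left + right)

# Invented illustrative values for a left-lateralised response:
print(laterality_index(left=3.0, right=1.0))  # 0.5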
To sum up, the available research has provided some evidence for the psychological reality and neural mechanisms of phonological representation and processing in sign languages, but there is still a lack of consistency regarding the processing units and their roles, as well as a lack of evidence on modality-specific processing mechanisms. Accordingly, some suggestions are offered for future studies: (1) further investigation of the roles of individual phonological parameters; (2) further exploration of the neural mechanisms of sign phonological processing, especially brain activity during the perception of specific parameters and comparison of the mechanisms that different tasks may invoke; (3) enrichment of studies on a wider range of sign languages, as different languages may have their own phonological, grammatical, or syntactic properties, which may result in cross-linguistic differences in processing mechanisms.

Key words

sign languages / phonological processing / phonological parameters / neural mechanisms

Cite this article

Zhang Xiaohong, Li Hong. Review of Phonological Processing in Sign Languages and Its Neural Mechanisms[J]. Journal of Psychological Science, 2023, 46(4): 1017-1023. https://doi.org/10.16719/j.cnki.1671-6981.202304031
