
Gestures or signs?

The project Gestures or signs? Comparing manual and non-manual constructions sharing the same form in co-speech gesture and sign language: a corpus-driven approach is funded by the German Research Foundation (DFG) and is part of the DFG Priority Programme 2392 Visual Communication. It runs from 2022 to 2025.

A number of manual and non-manual constructions are not restricted to signed language; they are also observed in co-speech gesture. The following forms will be systematically treated in this project: palm-up, throw-away, pointing, list buoys, eyebrow raise and sideward body leans. While some of these constructions have been researched, other forms remain un(der)studied, and there is as yet no fine-grained comparison of all these constructions between signers and speakers. This project fills this gap. It aims to provide a detailed corpus-based analysis of the selected constructions in both co-speech gesture and sign. The goal is to determine how, and to what degree, these constructions in sign language differ from comparable forms in gesture on both functional and formational grounds. As these constructions share the same modality, they will be given the same theoretical treatment and investigated along a number of dimensions, blurring the strict gesture-sign binary and supporting the understanding of these manual and non-manual activities as forming a cross-modal continuum along which functional conventionalization and lexicalization take place. By distinguishing signs and gestures, the two prime examples of visual communication, the project also provides further insights into the interaction of different channels and the grammatical system(s) underlying this interaction, contributing to a new modality-free comprehensive theoretical model of language and communication.


Team

Anastasia Bauer – principal investigator
Anna Kuder – postdoctoral researcher
Roman Poryadin – research assistant
Milena Pielen – student assistant

Smiling and laughter in spoken and signed interaction

Project description

In this project we focus on the extent to which different sign languages vary in the use of smiles and laughter functioning as feedback and/or alignment. We assume that the primary function of alignment smiles is to show similarity and togetherness (Bavelas et al. 1986), while feedback smiles are used mostly as continuers and laughter as assessments. Feedback is known to vary considerably across individuals and across different types of contexts (Dideriksen et al. 2023; Blomsma et al. 2024). This study aims to assess possible cross-linguistic/cross-cultural differences in the pragmatic functions of smiling behavior in spontaneous signed dyadic face-to-face conversations. We quantify and analyze the variability of different smiling behaviors in face-to-face interactions in three sign languages: German (DGS), Russian (RSL) and Polish (PJM).


We use data extracted from the corpora of the three languages (Konrad et al. 2022; Bauer & Poryadin 2023; Kuder et al. 2022). Each data sample consists of approximately one hour of dyadic conversational data per language. We identified and annotated smiles in spontaneous conversational interactions in the three signed languages and labeled them following the Smiling Intensity Scale (Gironzetti, Attardo & Pickering 2016). Following this scale, we differentiate four subtypes of smiling behavior: closed mouth smile (s1), open mouth smile (s2), wide open mouth smile (s3) and laughing smile (s4). In each data sample we annotate all instances of smiling behavior in both participants, differentiating between smiles produced as feedback signals (when only the addressee smiles) and smiles produced in alignment (when both interlocutors smile). We count as alignment all smiles produced up to 300 ms after the initial smile of the other interlocutor.
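To make the classification rule concrete, here is a minimal sketch in Python; it is purely illustrative, and the Smile record, its field names and the onset-based check are assumptions for the example, not the project's actual annotation tooling.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Smile:
    participant: str   # e.g. "A" or "B"
    onset_ms: int      # smile onset in milliseconds
    offset_ms: int     # smile offset in milliseconds
    intensity: str     # "s1".."s4" on the Smiling Intensity Scale

# Threshold taken from the project description (300 ms)
ALIGNMENT_WINDOW_MS = 300

def classify_smiles(smiles: List[Smile]) -> List[str]:
    """Label each smile as 'alignment' or 'feedback'.

    A smile counts as alignment if its onset falls within 300 ms
    after the onset of a smile by the other interlocutor; otherwise
    it is treated as a feedback smile (only the addressee smiles).
    """
    labels = []
    for s in smiles:
        others = [o for o in smiles if o.participant != s.participant]
        is_alignment = any(
            0 <= s.onset_ms - o.onset_ms <= ALIGNMENT_WINDOW_MS for o in others
        )
        labels.append("alignment" if is_alignment else "feedback")
    return labels

# Example: B's open-mouth smile (s2) starts 200 ms after A's smile -> alignment
smiles = [
    Smile("A", onset_ms=1000, offset_ms=2500, intensity="s1"),
    Smile("B", onset_ms=1200, offset_ms=2400, intensity="s2"),
]
print(classify_smiles(smiles))  # ['feedback', 'alignment']
```

The onset-based window above is only one possible operationalization; since alignment is defined as both interlocutors smiling, temporal overlap between the two participants' smiles could be checked in addition.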

Collaborations with Anna Kuder

Latest outcome

Kuder, Anna & Anastasia Bauer. 2024. Smiling in spontaneous dyadic signed interaction: disentangling feedback and alignment functions. Expressing emotions in sign languages, Institut für Deutsche Gebärdensprache, Hamburg, 4–5 July, DOI: 10.13140/RG.2.2.24668.19848.

A corpus-based contrastive analysis of PALM-UP forms and functions in Polish Sign Language, German Sign Language and Russian Sign Language narratives

Palm-up is a multifunctional manual activity taking the form of rotating one’s forearms so that the palms of the hands face upward (e.g., Cooperrider et al. 2018). The movement patterns of the gesture can vary: the hand(s) can move forward, downward or laterally to the sides. These movements can include a wrist movement and sometimes a hold at the end. According to Kendon (2004), all of these forms belong to the open hand supine gesture family.

Palm-up gestures occur in many different languages of the world and in languages of different modalities: both spoken and sign language users produce this gesture form when they communicate. We aim for a large-scale, entirely corpus-based study across multiple sign languages to compare the use of palm-up.

Collaboration with Anna Kuder and Pamela Perniss

Latest outcome

Kuder, Anna, Anastasia Bauer & Pamela Perniss. (to be submitted in October 2024). Palm-up in Polish, German and Russian Sign Language.

Kuder, Anna, Anastasia Bauer & Pamela Perniss. 2022. Gesture, Sign, Visible Action? A Corpus-based Comparative Study of Palm-ups and Throw-Away in Polish, German and Russian Sign Language. Poster presented at TISLR14, 26 September, Osaka, Japan, DOI: 10.13140/RG.2.2.33841.86887.

Bauer, Anastasia. 2019. Nonmanual components with palm-up in Russian Sign Language. Poster at the 93rd LSA Annual Meeting, New York, NY, 3–7 January, DOI: 10.13140/RG.2.2.30528.79363.