This corpus is under development at the University of Cologne, as part of the project *"Gestures or Signs? Comparing Manual and Non-manual Constructions Sharing the Same Form in Co-speech Gesture and Sign Language: A Corpus-driven Approach"*, funded by the DFG within the ViCom priority programme. The focus is on both manual (hand signs and hand gestures) and non-manual (eye movements, facial expressions) constructions that share similar forms across communicative modalities.
The corpus consists of video and audio recordings of spontaneous conversations between L1 German speakers. Participants are free to choose their conversation topics, promoting natural interaction and gesture use. The first recording has already been completed, and more sessions are planned for the near future to expand the data set.
In addition to the audiovisual recordings, the Pupil Labs eye-tracking system is used to capture detailed gaze behavior, providing insights into mutual gaze between participants, fixations on body parts during gestural communication, blink events, and other non-verbal cues.
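As an illustration, the following is a minimal Python sketch of how such gaze data might be summarised, assuming Pupil Player-style CSV exports (`fixations.csv`, `blinks.csv`) with `start_timestamp` and `duration` columns; the file paths, column names, and units are assumptions, not part of the project description:

```python
# Minimal sketch: summarising fixation and blink events from an
# eye-tracking export. Assumes Pupil Player-style CSVs with columns
# "start_timestamp" and "duration" (here taken to be milliseconds);
# check the actual export for the exact schema.
import pandas as pd

fixations = pd.read_csv("exports/000/fixations.csv")  # hypothetical path
blinks = pd.read_csv("exports/000/blinks.csv")        # hypothetical path

# Per-recording summary statistics of gaze behaviour.
print(f"{len(fixations)} fixations, "
      f"mean duration {fixations['duration'].mean():.1f} ms")
print(f"{len(blinks)} blink events")

# Long fixations may indicate sustained attention on an
# interlocutor's hands or face during a gesture.
long_fixations = fixations[fixations["duration"] > 1000]
print(f"{len(long_fixations)} fixations longer than 1 s")
```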
The recordings are annotated in detail, including speech transcriptions, gesture identification, and coding of non-manual signals such as gaze and facial expressions. The aim is to create a comprehensive, corpus-driven analysis of the interaction between manual and non-manual constructions in both spoken and sign languages.
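As a sketch of how such multi-tier annotations could be queried programmatically, assuming the annotations are stored as ELAN (`.eaf`) files — a common format in gesture and sign language research, though not specified above — one might use the `pympi` library; the file and tier names here are hypothetical:

```python
# Minimal sketch: reading multi-tier annotations from an ELAN file.
# The ELAN format and the "Gesture" tier name are assumptions for
# illustration, not confirmed details of this corpus.
import pympi

eaf = pympi.Elan.Eaf("session01.eaf")  # hypothetical file name
print(eaf.get_tier_names())

# Each annotation is a (start_ms, end_ms, value) tuple.
for start, end, value in eaf.get_annotation_data_for_tier("Gesture"):
    print(f"{start}-{end} ms: {value}")
```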