Three papers from NONMANUAL at LREC-SL 2024
In May 2024, the 11th Workshop on the Representation and Processing of Sign Languages: Evaluation of Sign Language Resources (LREC-SL 2024: https://www.sign-lang.uni-hamburg.de/lrec2024/programme.html) took place in Torino, Italy. It is an important venue for computational and general linguists working on sign languages.
The NONMANUAL project was represented by three poster presentations.
- Kimmelman, V., M. Oomen & R. Pfau (2024). Headshakes in NGT: Relation between Phonetic Properties & Linguistic Functions. Proceedings of LREC-SL 2024. https://www.sign-lang.uni-hamburg.de/lrec/pub/24008.html
We use OpenFace to measure head rotation during headshakes expressing negation in NGT (Sign Language of the Netherlands). We find that some of the phonetic/kinetic measures of headshakes correlate with their linguistic functions. (A sketch of how such head-rotation measures can be extracted from OpenFace output follows after the list.)
- Kimmelman, V., A. Price, J. Safar, C. de Vos & J. Bulla (2024). Nonmanual Marking of Questions in Balinese Homesign Interactions: A Computer-Vision Assisted Analysis. Proceedings of LREC-SL 2024. https://www.sign-lang.uni-hamburg.de/lrec/pub/24009.html
We look at the nonmanual marking of questions in five deaf homesigners from Bali. It turns out that polar and non-polar questions are marked by opposite directions of head movement, and that this is consistent across homesigners. The analysis combines manual annotation with measurements of head tilt (pitch) extracted with OpenFace (see the second sketch after the list).
- Susman, M. & V. Kimmelman (2024). Eye Blink Detection in Sign Language Data Using CNNs and Rule-Based Methods. Proceedings of LREC-SL 2024. https://www.sign-lang.uni-hamburg.de/lrec/pub/24005.html
Eye blinks are important prosodic markers across sign languages, yet cross-linguistic research on these markers is almost non-existent. To enable cross-linguistic comparison, we develop and test two methods for automatic eye blink detection, one based on CNNs and one rule-based; both produce promising results. (The rule-based idea is sketched in the last example after the list.)
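For readers who want a concrete picture of what "measuring head rotation with OpenFace" involves, here is a minimal sketch, assuming OpenFace 2.x FeatureExtraction CSV output. The file name, frame span, frame rate, and the particular summary measures are illustrative assumptions, not the measures reported in the paper; only the pose_Ry column (head yaw in radians) follows OpenFace's output conventions.

```python
# Hedged sketch: summarise head yaw (pose_Ry) over a headshake span.
# File name, span, frame rate and measures are assumptions for illustration.
import numpy as np
import pandas as pd

df = pd.read_csv("signer01_openface.csv")        # hypothetical OpenFace output file
df.columns = df.columns.str.strip()              # OpenFace CSV headers often carry a leading space

fps = 25                                         # assumed video frame rate
start, end = 120, 180                            # hypothetical headshake span (frame numbers)
yaw = df.loc[start:end, "pose_Ry"].to_numpy()    # head rotation around the vertical axis (radians)

amplitude = yaw.max() - yaw.min()                # peak-to-peak rotation
velocity = np.gradient(yaw) * fps                # radians per second
turns = int(np.sum(np.diff(np.sign(velocity)) != 0))  # direction changes, roughly the number of head turns
duration = (end - start) / fps                   # seconds

print(f"amplitude={amplitude:.3f} rad, turns={turns}, duration={duration:.2f} s")
```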
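In the same spirit, here is a minimal sketch of how manual annotation and OpenFace pitch (pose_Rx) can be combined for the question study: the annotation table, file name, and the pitch-change measure below are invented for illustration and do not reproduce the paper's analysis.

```python
# Hedged sketch: compare head pitch (pose_Rx) across manually annotated question spans.
# The annotation table and the pitch-change measure are assumptions for illustration.
import pandas as pd

pose = pd.read_csv("homesigner_openface.csv")    # hypothetical OpenFace output file
pose.columns = pose.columns.str.strip()

# Hypothetical annotation export (e.g. from ELAN): one row per question span.
ann = pd.DataFrame({
    "onset":  [100, 340, 620],                   # span start frames
    "offset": [150, 400, 700],                   # span end frames
    "type":   ["polar", "non-polar", "polar"],   # question type from manual annotation
})

def pitch_change(row):
    """Mean pitch during the span minus mean pitch just before it (a rough tilt measure)."""
    during = pose.loc[row.onset:row.offset, "pose_Rx"].mean()
    before = pose.loc[max(0, row.onset - 25):row.onset, "pose_Rx"].mean()
    return during - before

ann["pitch_change"] = ann.apply(pitch_change, axis=1)
print(ann.groupby("type")["pitch_change"].mean())  # opposite signs would suggest opposite tilt directions
```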
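Finally, a minimal sketch of the rule-based idea behind automatic blink detection, here reduced to thresholding OpenFace's AU45_r (blink/eye-closure intensity) and keeping short closure runs. The threshold, duration cut-off, and file name are assumptions; neither of the paper's two methods is reproduced exactly.

```python
# Hedged sketch of a rule-based blink detector: short runs of high AU45_r count as blinks.
# Threshold and duration cut-off are assumptions for illustration.
import pandas as pd

df = pd.read_csv("signer_openface.csv")          # hypothetical OpenFace output file
df.columns = df.columns.str.strip()

fps = 25                                         # assumed frame rate
closed = df["AU45_r"] > 1.0                      # assumed intensity threshold for "eyes closed"

blinks = []
run_start = None
for i, is_closed in enumerate(closed):
    if is_closed and run_start is None:
        run_start = i                            # eye closure begins
    elif not is_closed and run_start is not None:
        if i - run_start <= 0.5 * fps:           # closures longer than ~0.5 s are unlikely to be blinks
            blinks.append((run_start, i))
        run_start = None

print(f"{len(blinks)} blink candidates detected")
```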
All three studies are published in the proceedings of the workshop and are available open access. Follow the links above to read the papers!