From Lordi’s brave repudiation of religious dogma with “Hard Rock Hallelujah” to Johnny Logan’s treatise on space-time with “What’s Another Year,” Eurovision is often credited with introducing brave new concepts to the world. The contest also proved to be a muse for one LITHME-associated researcher in the field of sign language processing.
Amit Moryossef is a PhD student at Bar-Ilan University in Israel, currently undertaking a Short-Term Scientific Mission (STSM) at the University of Zurich (UZH) as part of the LITHME scheme. His research focuses on sign language processing, specifically sign language translation and understanding, and his inspiration came from the most flamboyant of places.
“Back in 2019, the Eurovision song contest took place in Israel, including sign language interpretations for the songs. I talked about it with a friend of mine who is an Israeli Sign Language interpreter, and at some point realised I don’t know any translation system for sign languages. There is no Google Translate for sign languages.
Surprised by this revelation, I said ‘ok then, I’ll go back home and build you something’. I thought it would take a week, and I’ve been here ever since!”
Having started his PhD in 2019, Amit also completed a remote internship with Google Zurich from May 2020 to March 2021. It was during this internship that he heard about research relating to sign languages at the University of Zurich.
“It took a lot with Covid to figure out how to come here but eventually I’m here. Now I’m finished with Google but started working with the University, working together: doing something more in-person, more involved, and more strict with regards to time.”
Amit’s contact in Zurich pointed him towards LITHME to help fund his Short-Term Scientific Mission, and he remains grateful for the support he is receiving from LITHME.
Two things Amit is very clear about are his own future plans and the future of the field itself. After his STSM he will return to Israel to finish his PhD, and he remains very optimistic about the direction in which sign language processing is heading.
“Previously I was telling people I’m building Google Translate for sign languages. It’s easy for people to understand. As time passes, I’m also thinking that I really want to make content accessible online. One goal would be interpretation software for any video online, such that a deaf person would be able to open YouTube or Netflix and, with a click of a button, have the content interpreted for them. It won’t be perfect but will slowly improve. Or this can be part of something like Google Translate, in which you could write and it will generate a sign, or you could sign and it will translate into spoken language text. That’s the future, I feel.”
The use of animated avatars for interpreting from spoken languages into sign languages has proven to be a contentious issue, raising ethical concerns that Amit himself is mindful of.
“I can understand the concern regarding having cartoon-looking mechanically-moving avatars that perform signs one-after-another in a deterministic way, and that usually only focus on the hands in a sign language, and not at all the face, and I think these concerns are valid. I, as a hearing person, wouldn’t like a robotic text-to-speech system from the early 2000s to read me text as well.
But as technology progresses, I can see this becoming less of an issue. Recent video games have shown that it is possible to create very convincingly human characters with many articulators throughout the body, so I think it is a valid and worthwhile direction to work with avatars, with the hope that, with progress, one could combine sign language production techniques with realistic-looking and realistic-moving avatars.”
Amit is researching at the University of Zurich for three months, and his scientific report will be published on the LITHME website after he completes his STSM.
Amit Moryossef was interviewed by LITHME intern Peadar Faherty.