From The Daily Yomiuri: Robotic hand translates speech into sign language
An 80-centimeter robotic hand that can convert spoken words and simple phrases into sign language has been developed in a town in Fukuoka Prefecture.
…
A microchip in the robot recognizes the 50-character hiragana syllabary and about 10 simple phrases such as “ohayo” (good morning) and sends the information to a central computer, which sends commands to 18 micromotors in the joints of the robotic hand, translating the sound it hears into sign language.
…
The robot was shown to teachers at the school in December to ensure that its sign language was understandable.
That last comment is especially interesting to me. It seems that the translation is nowhere near perfect, and is based almost entirely on words and phrases, not on statements or meanings. On any standard account, this would imply that the machine isn’t really doing a translation at all, but merely computing a function that maps Japanese words onto movements of the robotic hand. But that objection misses the essential point of communication: that the message conveyed is actually understood by the interlocutor.
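The article gives no implementation details, but the behavior it reports (a small fixed vocabulary, a recognizer feeding a central computer, 18 joint motors) is consistent with exactly this kind of word-to-movement function: a lookup table from recognized words to canned motor sequences. Here is a minimal sketch of that idea; every name, angle, and the motor API are hypothetical, since the source describes none of them.

```python
# Hypothetical sketch of the word-to-gesture pipeline the article suggests:
# a fixed lookup from recognized words/phrases to joint-motor commands.
# All identifiers and values are illustrative; the article only mentions
# "18 micromotors" and a vocabulary of ~10 phrases plus the hiragana set.

from typing import Dict, List

# A gesture is a sequence of poses; each pose sets target angles (degrees)
# for some of the 18 joint motors, identified by index 0-17.
Pose = Dict[int, float]
Gesture = List[Pose]

# Tiny illustrative gesture table keyed by the recognized input.
GESTURES: Dict[str, Gesture] = {
    "ohayo": [                        # "good morning" -- made-up poses
        {0: 30.0, 1: 45.0, 5: 10.0},  # raise and shape the hand
        {0: 0.0, 1: 0.0, 5: 0.0},     # return to rest
    ],
}

def set_motor_angle(motor_id: int, angle: float) -> None:
    """Stand-in for the real motor interface (hypothetical)."""
    print(f"motor {motor_id:2d} -> {angle:5.1f} deg")

def sign(recognized: str) -> None:
    """Replay the stored gesture for a recognized word, if any."""
    gesture = GESTURES.get(recognized)
    if gesture is None:
        return  # out-of-vocabulary input is simply ignored
    for pose in gesture:
        for motor_id, angle in pose.items():
            set_motor_angle(motor_id, angle)

sign("ohayo")
```

Note what this sketch lacks: any representation of sentence structure or meaning. That is precisely why checking the output with the teachers matters; whether communication succeeds is settled by whether the signs are understood, not by what the lookup table does internally.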