IEEE World Haptics 2017

Interview with Sven Topp

Sven Topp is one of the most fascinating and inspiring participants of this year's World Haptics Conference. Despite his disability, he is attending the conference and presenting his work in the demo session. Sven is deafblind. As a result of meningitis, he lost his vision at age 10, leaving him with only 5% peripheral vision in his right eye and no vision in his left eye. Then, at age 13, his hearing deteriorated very rapidly. He does perceive some low frequencies, but that is more a matter of feeling than hearing. Sven travelled to Munich from Australia together with three interpreters who translate for him using haptic sign language.

Why did you choose a career in haptics?

Sven: I am interested in computer science and technology. My computer science degree covered a very broad range of courses combining many different areas, including psychology, industrial design, human-computer interfaces, robotics, and hardware design/programming. But my interest in tactile communication is mainly the result of my disability. Haptic sign language is my method of everyday communication. I basically live through haptics.

How do you communicate with your interpreters?

Sven: We communicate through a series of haptically unique symbols, including slides and taps. Each symbol represents a letter, a word, or a whole phrase. For example, we have symbols for the words “haptics”, “meningitis” and “important”, while a tap on a fingertip means, depending on which finger, one of the letters “a”, “e”, “i”, “o” or “u”. The haptic language is based on the Auslan sign language and is used by other deafblind individuals. But since my vocabulary involves a lot of technical terms, we had to expand the language to communicate all the technical information. My interpreters keep adding words, so our haptic language is constantly evolving.
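
For readers curious how such a tap-and-slide code might look in software, here is a minimal, purely illustrative sketch in Python. The finger-to-vowel assignment and the symbol names below are assumptions for the sake of the example; the interview only says that taps on different fingertips stand for the vowels and that some gestures stand for whole words.

```python
# Illustrative sketch of a tap/slide haptic symbol lookup.
# The specific finger-to-vowel assignment is hypothetical, not Sven's actual code.
TAP_TO_VOWEL = {
    "thumb": "a",
    "index": "e",
    "middle": "i",
    "ring": "o",
    "little": "u",
}

# Some gestures stand for whole words (example words taken from the interview).
GESTURE_TO_WORD = {
    "slide_haptics": "haptics",
    "slide_meningitis": "meningitis",
    "slide_important": "important",
}

def decode(symbols):
    """Turn a sequence of (kind, value) haptic symbols into text."""
    out = []
    for kind, value in symbols:
        if kind == "tap":
            out.append(TAP_TO_VOWEL[value])
        elif kind == "gesture":
            out.append(GESTURE_TO_WORD[value])
    return " ".join(out)

print(decode([("gesture", "slide_haptics"), ("tap", "index")]))  # -> "haptics e"
```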

What are you currently working on?

Sven: Together with Vincent Hayward, I am working on a device that emulates tactile communication. The way my interpreters communicate is physically straining for me and has resulted in a bad shoulder. We created a more ergonomic device that I hold in a similar way to a mouse. With this device, an actuator talks underneath my hand and puts much less strain on my shoulder. The system works with keyboard entry that translates directly into a pattern on the hand. The great benefit is that there is no learning curve, since the haptic language is identical to the one I already use with my interpreters. Hopefully, the same principle will work with speech recognition in the future.
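
The keyboard-to-hand pipeline could be pictured roughly as follows. This is only a sketch under assumed names: the real device's pattern set and actuator interface are not described in the interview, and `play_pulse()` is a placeholder for whatever drives the actuator.

```python
import time

# Hypothetical (duration in seconds, amplitude 0..1) pulse sequences per character.
PATTERNS = {
    "a": [(0.05, 0.8)],                # single short tap
    "e": [(0.05, 0.8), (0.05, 0.8)],   # double tap
    " ": [(0.15, 0.3)],                # long soft pulse as a word gap
}

def play_pulse(duration_s: float, amplitude: float) -> None:
    """Placeholder for driving the actuator; here it only waits out the pulse."""
    time.sleep(duration_s)

def render_text(text: str) -> None:
    """Translate keyboard input into a tactile pattern, one character at a time."""
    for ch in text.lower():
        for duration_s, amplitude in PATTERNS.get(ch, []):
            play_pulse(duration_s, amplitude)
        time.sleep(0.03)  # brief gap between symbols

render_text("a e")
```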

Do you have hobbies/interests outside of haptics?

Sven: I like reading, computer animation and 3D modeling, and playing chess. I learned to play chess before I became deafblind and still enjoy the game. I also used to do Taekwondo and earned a red belt.

What are your future goals?

Sven: Right now, when an interpreter gives input, it’s not like typing or Braille. It’s not monotonous. Our haptic communication carries the expression, emphasis, emotion, and rhythm of speech. I want to add that to the device. By varying the length of taps and the strength of vibrations, I can enable it to talk to me with emotional content.
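
One way such modulation could work, purely as an illustration of the idea (the actual encoding is still future work and is not specified in the interview), is to scale a base pulse by an emphasis value:

```python
# Illustrative only: emphasis lengthens the tap and strengthens the vibration.
def with_emphasis(pulse, emphasis: float):
    """Scale a (duration_s, amplitude) pulse; emphasis 0.0 leaves it unchanged."""
    duration_s, amplitude = pulse
    return (duration_s * (1.0 + 0.5 * emphasis),     # longer tap for stressed words
            min(1.0, amplitude * (1.0 + emphasis)))  # stronger vibration, capped at 1.0

print(with_emphasis((0.05, 0.6), emphasis=0.8))  # -> (0.07, 1.0)
```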

Interview by Jasper van de Lagemaat