IEEE World Haptics 2017

Demonstrations


Demonstrations 1

D1.01 WoodenHaptics 1.5 featuring USB and vintage device support
Jordi Solsona Belenguer1, Jonas Forsslund1, Ellinore Seybolt2, Alexander Jonsson2
1Forsslund Systems AB, 2KTH Royal Institute of Technology

 

Abstract: WoodenHaptics was first presented at the ACM TEI conference in 2015 as a module-based, open-source-hardware spatial haptic device. In this demo an updated version is shown, featuring new full USB support that makes it feasible to use the device with a standard laptop and a consumer power supply. The demo will feature three haptic devices that all use the same modular electronics box that we have designed. First, the open-source WoodenHaptics device, which is designed for modification in shape and size as well as tuning of parameters to fit a particular task or application. Second, a compact work-in-progress aluminum-based device with the electronics embedded inside. And finally, we show how the vintage Phantom Premium haptic device can be given USB support using just two additional passive adapters and our electronics box. The demo applications are from the open-source software library Chai3D 3.2 and will be shown on two laptops running Ubuntu 16.04. The schematics, firmware code and software drivers can all be downloaded from woodenhaptics.org and used under a Creative Commons share-alike license.

 

D1.02 A Ball and Beam Module for a Haptic Paddle Education Platform
Nathan Bucki1, Chad Rose1, Marcia O'Malley1
1Rice University

 

Abstract: Single degree-of-freedom, force-feedback mechatronic devices, often referred to as haptic paddles, are used as teaching tools in traditional university curricula as well as in massive open online courses (MOOCs). While devices differ based on the specific goals of a given course, broadly speaking they provide ‘hands-on’ learning for students studying mechatronics and dynamics. This demonstration will allow participants to interact with the third iteration of the Haptic Paddle at Rice University and the virtual environments used in course curricula. The improvement in closed-loop performance enables additional experimental plants to be interfaced with the haptic paddle base, which can be directed at advanced dynamics and controls courses, or special topics in mechatronics and haptics. This demo will include the Haptic Ball and Beam: a dynamic plant for teleoperation demonstrations with the haptic paddle, a multi-input plant for implementing more complex control structures, and a testbed for haptic motor learning experiments in undergraduate coursework. Interactions with this module will include haptic rendering of a virtual ball, as well as bilateral teleoperation.

 

D1.03 Portable Haptic Display with 12x16 Tactile Pins
Juan Jose Zarate1, Nadine Besse1, Olexandr Gudozhnik1, Anthony Sebastien Ruch1, Herbert Shea1
1École polytechnique fédérale de Lausanne

 

Abstract: We present a haptic display based on electromagnetic (EM) actuators, designed to provide graphical information to blind and visually impaired users. We demonstrate a 15 cm × 14 cm × 4 cm haptic display with 192 taxels on an 8 mm pitch. Each EM actuator consists of a shielded permanent magnet that moves between two stable latched positions. To switch from up to down or vice versa, planar PCB coils are actuated with 10 ms pulses of electrical current. All pins have a measured holding force of over 200 mN and 0.8 mm of displacement. Latching enables low-power operation: the device is completely wireless, running off rechargeable batteries. Using PCB technology ensures scalability and low cost. The taxel state (up/down) is easy to feel, and the device has an extremely fast refresh time (less than 20 ms), allowing for dynamic patterns of moving or “blinking” taxels; it is truly portable and enables a broad range of scenarios. We will present a similar prototype of this technology at CHI 2017, Interactivity Demo. Here we extend the concept to a completely wireless device with new scenarios, including zooming in/out of maps and new serious games.
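The latching principle described above can be sketched in code; this is an illustrative model of the refresh logic (names and data layout are our assumptions, not the device firmware):

```python
# Sketch of a latched-taxel refresh loop. Each taxel holds its state
# magnetically, so only taxels that differ between frames need a 10 ms coil
# pulse; holding a static image costs no power at all.

ROWS, COLS = 12, 16  # 192 taxels on an 8 mm pitch

def refresh(current, target):
    """Return the list of (row, col, new_state) coil pulses needed to show target."""
    pulses = []
    for r in range(ROWS):
        for c in range(COLS):
            if current[r][c] != target[r][c]:
                pulses.append((r, c, target[r][c]))  # one 10 ms pulse per flip
                current[r][c] = target[r][c]
    return pulses

frame_a = [[False] * COLS for _ in range(ROWS)]
frame_b = [[(r + c) % 2 == 0 for c in range(COLS)] for r in range(ROWS)]
pulses = refresh(frame_a, frame_b)  # first draw: only the changed taxels flip
again = refresh(frame_a, frame_b)   # identical frame: zero pulses, zero power
```

The delta-only update is what makes the battery-powered, wireless operation plausible.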

 

D1.04 Innovative SMA haptic button for a local, customisable and HD feeling
Tom Powell1, Andrea Cantone1
1Cambridge Mechatronics Ltd

 

Abstract: Cambridge Mechatronics Ltd. presents local, direct-impulse haptic button technology delivered by Shape Memory Alloy (SMA) wire with a small actuator footprint. Present-day haptic technologies for handheld and wearable consumer devices operate by propagating small vibrations through the whole device. This performs well for crude applications such as text alerts on a smartphone or alarms on a smartwatch, but is poor at emulating localised events such as button clicks; the whole device shakes in response to a button activation event, which feels unnatural. In CML’s SMA haptic button design, demonstrated here, SMA wire is attached to the button in an integrated design with a small footprint. When the button is pressed, a short pulse is sent to the SMA wire. The wire heats up and rapidly contracts due to the shape memory effect, moving the button and delivering a single, quick impulse directly to the user’s finger without disturbing the rest of the device, faithfully emulating a button-click sensation. By modifying the pulse duration, pulse shape and drive voltage, the haptic sensation is customisable.

 

D1.05 Spatial guidance of the human arm motion using vibrotactile feedback
Denis Cehajic1, Satoshi Endo1, Marco Aggravi2, Domenico Prattichizzo2, Sandra Hirche1
1Technical University of Munich, 2University of Siena

 

Abstract: Using vibrotactile wristbands for guiding visually impaired people or in human-robot interaction tasks requires a method for analyzing the utility of the haptic device from perception to action. Human tactile sensitivity differs across the body, age, and gender, which affects the precision with which humans can localize vibrotactile feedback and biases the spatial interpretation of the vibrotactile stimuli. This demonstration is related to the work-in-progress paper, which investigates appropriate mappings between vibrotactile motor amplitudes and the participant’s perceived direction. In this demonstration, participants are guided towards a desired motion trajectory using a vibrotactile wristband. The vibrotactile interface, consisting of four motors, is worn on the wrist. The human motion is tracked using a markerless motion tracking system.
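One plausible amplitude mapping for a four-motor wristband can be sketched as follows; the cosine weighting and motor placement are illustrative assumptions, not the mapping investigated in the paper:

```python
import math

# Illustrative direction-to-amplitude mapping for a four-motor wristband:
# each motor is driven in proportion to how well its placement angle aligns
# with the desired cue direction, and motors facing away stay off.
MOTOR_ANGLES = {"right": 0.0, "up": 90.0, "left": 180.0, "down": 270.0}

def motor_amplitudes(direction_deg):
    """Return a normalized amplitude (0..1) for each wristband motor."""
    amps = {}
    for name, ang in MOTOR_ANGLES.items():
        w = math.cos(math.radians(direction_deg - ang))
        amps[name] = max(0.0, w)  # clip: opposing motors do not vibrate
    return amps

a = motor_amplitudes(45.0)  # a cue halfway between "right" and "up"
```

With this scheme, intermediate directions are rendered by blending the two adjacent motors, which is one common way to get continuous directional cues from a small number of actuators.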

 

D1.06 Rice OpenWrist and CUFF Integration: Single DOF Haptic Feedback for Trajectory-tracking
Evan Pezent1, Simone Fani2, Joshua Bradley1, Manuel Catalano3, Matteo Bianchi2, Marcia O'Malley1
1Rice University, 2University of Pisa, 3Istituto Italiano di Tecnologia

 

Abstract: There is much interest in using haptic feedback for training new skills or guiding human movement. However, results of studies that have incorporated haptic guidance to train new skills are mixed, depending on task complexity and the method by which the haptic guidance is implemented. Subjects show dependency on the guidance forces, or difficulty discerning which aspects of the haptic feedback are related to the task dynamics and which are meant to convey task-completion strategies. It is necessary to devise new methods to convey haptic cues for guidance separately from the haptic feedback of task dynamics to ensure effective skill transfer from the training environment to the real-world task. This demonstration features the integration of the Rice OpenWrist exoskeleton with the CUFF (Clenching Upper-limb Force Feedback) haptic feedback device. Subjects will attempt to track trajectories or locate targets in single- and multi-DOF workspaces while receiving real-time performance feedback via the CUFF. Visual and auditory feedback will be suppressed so that only the mechano-tactile stimulation of normal and tangential skin forces from the CUFF can be used for task completion. The multi-modal haptic environment allows for separation of task and guidance dynamics, promising to be an effective virtual environment training platform.
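The separation of guidance cues from task dynamics can be illustrated with a minimal error-to-cue mapping; the gain and force limit below are made-up values for the sketch, not parameters of the demo:

```python
# Minimal sketch of error-proportional guidance feedback: the CUFF squeeze
# force grows with trajectory-tracking error and saturates at a comfortable
# maximum, entirely independent of any task-dynamics forces the exoskeleton
# renders. Gain and limit values are illustrative assumptions.

def cuff_squeeze(error_rad, gain=8.0, f_max=5.0):
    """Map absolute tracking error (rad) to a normal (squeeze) force (N)."""
    return min(f_max, gain * abs(error_rad))

on_track = cuff_squeeze(0.0)   # on the trajectory: no cue at all
small = cuff_squeeze(0.25)     # proportional regime
large = cuff_squeeze(10.0)     # saturates at f_max
```

Because the guidance channel (skin squeeze) is physically distinct from the task channel (wrist torques), the subject can in principle attribute each sensation unambiguously.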

 

D1.07 Tactile Sensing Glove
Gereon Büscher1,2, Robert Haschke1,2
1Bielefeld University, 2CITEC

 

Abstract: We will demonstrate the latest version of our Tactile Data Glove, a human-wearable, tactile-sensing glove providing normal force/pressure measurements on about 60 taxels spread all over the palmar side of the glove. The sensor is composed of multiple layers of conductive fabrics including a piezo-resistive one. The sensor allows pressure measurements from a subtle touch (less than 1 kPa) up to high pressures of more than 500 kPa, which easily covers the common range for everyday human manual interactions. A 3D visualization is used to display data in an online fashion. The glove was successfully used in various human interaction studies, measuring interaction forces while grasping and manipulating objects, shaking hands with humans, or while performing a haptic search task. In all those experiments, the sensor provided invaluable insight into the force patterns that are driving those interactions.

 

D1.08 Demonstration of CLAP: A Soft-Hand Simulation Library for Haptics Applications
Mickeal Verschoor1, Daniel Lobo1, Miguel Otaduy1
1Universidad Rey Juan Carlos

 

Abstract: CLAP is a modular library which provides a simulation of a deformable hand. The library provides general interfaces for various hand-tracking solutions, third-party game engines such as Unity or Unreal, and various haptic devices, in order to provide the user with some form of haptic/tactile feedback. Within CLAP, a deformable hand is simulated and controlled by the input of the user’s hand through some external hand-tracking method. Furthermore, CLAP simulates contact between the virtual hand and the virtual environment provided by the third-party game engine. Contact between the hand and the grasped or touched objects generates forces inside the virtual hand and deforms its skin. These internal forces and deformations are then used to drive external devices for haptic and tactile feedback. Our main contribution with CLAP is to provide researchers with a tool for integrating their haptic and tactile devices into a virtual environment. The major difficulty solved by CLAP is maintaining a stable simulation of the virtual environment, the haptics simulation, and the deformable hand. In this demonstration we show our hand simulation library in combination with external hand-tracking and third-party game engines. The main focus will be on tasks related to grasping, touching and placing objects in the virtual world using one’s own hands as input. There are no safety risks for the attendees and spectators.

 

D1.09 Mid-air Tactile Application Using Indirect Laser Radiation for Contour-Following Stimulation
Hojun Cha1, Hojin Lee1, Junsuk Park1, Hyung-Sik Kim2, Soon-Cheol Chung3, Seungmoon Choi1
1Pohang University of Science and Technology, 2Biomedical Engineering, 3Konkuk University

 

Abstract: This demonstration presents an indirect laser stimulation scenario, called LaserStroke, as an application of the same authors’ paper “Mid-air Tactile Display Using Indirect Laser Radiation for Contour-Following Stimulation and Assessment of Its Spatial Acuity.” The application uses sequences of laser shots radiated onto an elastic medium on the palmar side of a latex glove worn on the user’s hand, eliciting tactile sensations through thermoelastic effects. The position of laser stimulation is controlled by a motorized gimbal. An earlier version of this demo was presented at UIST 2016, but this scenario has improved speed control based on our new perceptual experiments on the spatial acuity of indirect laser stimulation. In the demo, one user, the Sender, draws a contour on a screen using a mouse or a Leap Motion, or selects predefined symbols; another user, the Receiver, then feels tapping and stroking sensations following the contour drawn by the Sender. The left or right side of the demonstration setup needs to be blocked by a wall so that a laser shot cannot be irradiated in an unintended direction.

 

D1.10 The minimum size difference needed to discriminate two pliers
Qinqi Xu1, Gabriel Baud-Bovy1,2
1Istituto Italiano di Tecnologia, 2Vita-Salute San Raffaele University

 

Abstract: This demo showcases the haptic skills needed to estimate the size of articulated objects. In particular, we tackle the practical problem of perceiving the size of large pliers through haptic interaction. It complements the study presented in paper #308. In the demo, the attendee will be blindfolded and stand in front of a table to perform a haptic discrimination task. During each trial, the attendee will successively manipulate two pliers held with a power grasp and report verbally which of the two pliers was longer (two-alternative forced-choice task). We plan to present six pairs to each person, which should take about 6 minutes. The individual performance (and threshold measured on accumulated results) will be shown immediately. In the aforementioned study, we found that the best performance was obtained when the pliers could be manipulated freely. We proposed that physical interaction in this condition can facilitate bimanual integration of the sensory cues that are collected by the two hands during haptic exploration and the construction of a mental representation of the pliers’ geometry and size.

 

D1.11 Feeling of Lateral Skin Deformation with Electro-Tactile Display
Hiroyuki Kajimoto1
1The University of Electro-Communications, Japan

 

Abstract: Lateral skin deformation, which is an important cue for friction and bumps, is commonly displayed by reproducing the deformation mechanically. Our research question is whether mechanical deformation is indispensable, or whether a spatio-temporal pattern of tactile stimulation can substitute for it. Considering the skin strain energy distribution, we speculated that controlling the area size and intensity of a moving tactile stimulus is necessary to reproduce this feeling. Experimental results with an electro-tactile display partly supported this consideration.

 

D1.12 Wearable Pneumatic Wristband for Displaying Haptic Guidance Cues
Michael Raitor1, Matthew Gilbertson1, Allison Okamura1, Heather Culbertson1
1Stanford University

 

Abstract: In this demo, users will wear a pneumatic wristband to receive directional guidance cues. The wristband is composed of four separate pneumatic actuators that are placed on the top, bottom, left, and right of the wrist. The inflation of the actuators is pulsed to indicate one of six directions (up, down, left, right, clockwise, and counter-clockwise). The actuators are lightweight, flexible pouches manufactured from thermoplastic. The inelasticity of the actuators allows us to inflate them at low pressures with a small volume of air, which means that we can pulse them quickly (> 5 Hz). This pulsing actuation makes the guidance cue more noticeable than a sustained pressure cue. The system is actuated using a mechanical piston system to inflate and deflate the pouches.
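The pulsed directional cues can be sketched as a simple schedule; the pouch names and pulse orderings below are illustrative assumptions, not the demo's actual encoding:

```python
# Illustrative pulse schedules for the six directional cues. A single-pouch
# cue repeats one pouch at the pulsing rate (>5 Hz); a rotation cue walks
# around the wrist, pouch by pouch.
CUES = {
    "up":    ["top"],
    "down":  ["bottom"],
    "left":  ["left"],
    "right": ["right"],
    "cw":    ["top", "right", "bottom", "left"],
    "ccw":   ["top", "left", "bottom", "right"],
}

def pulse_train(cue, n_pulses=8):
    """Expand a cue into a sequence of n_pulses pouch inflations."""
    seq = CUES[cue]
    return [seq[i % len(seq)] for i in range(n_pulses)]
```

The same four actuators thus encode six distinct cues purely through timing, which is why the fast inflate/deflate cycle of the inelastic pouches matters.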

 

D1.13 Pseudo Force Presentation to Multiple Fingers by Asymmetric Rotational Vibration Using a Motor: Consideration in Grasping Posture
Rei Sakuragi1, Vibol Yem1, Hiroyuki Kajimoto1
1The University of Electro-Communications, Japan

 

Abstract: It is known that a pseudo force sensation of pulling in one direction is generated by presenting an asymmetrical vibration stimulus with different accelerations in a round trip. The present study employed a similar phenomenon using the asymmetric rotational vibration of a direct-current motor to present the pseudo force sensation to multiple fingertips. We investigated the frequency characteristics of this phenomenon for two fingers (i.e., the thumb and index finger) in a grasping posture, showing that vibration at a frequency around 30 Hz is optimal. We experimentally found that the equivalent physical force that this illusion generates is 10 to 30 grams, with large variance among participants in the study.
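The asymmetric-vibration idea can be illustrated with a toy waveform; the duty cycle and amplitudes below are assumptions for the sketch, not the drive signal used in the study:

```python
# Toy asymmetric acceleration profile at ~30 Hz: a short, strong acceleration
# in the "pull" direction is balanced by a longer, weaker return stroke, so
# the waveform integrates to zero (the device does not drift) yet has a
# directional peak that the skin perceives as a steady pulling force.

def asymmetric_cycle(n=100, duty=0.2, a_peak=1.0):
    """One vibration period sampled at n points; positive lobe lasts duty*n."""
    n_pos = int(n * duty)
    a_neg = -a_peak * n_pos / (n - n_pos)  # chosen so the cycle sums to zero
    return [a_peak] * n_pos + [a_neg] * (n - n_pos)

cycle = asymmetric_cycle()
```

The perceptual asymmetry between the strong brief lobe and the weak long lobe is what produces the pseudo force sensation, here quantified by the authors as an equivalent 10 to 30 grams.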

 

D1.14 Energy-Efficient Vibrotactile Presentation for Mobile Devices Using Belt Winding
Takuto Nakamura1,2, Vibol Yem1, Hiroyuki Kajimoto1
1The University of Electro-Communications, Japan, 2JSPS Research Fellow

 

Abstract: Vibration feedback is widely used in mobile devices to enrich the experience or to assist operation. In general, a vibration actuator vibrates the whole body of the mobile device to transmit vibrotactile sensation. However, producing a strong vibration requires considerable energy because the actuator must move the large mass of the device. In this study, we propose a method to transmit vibration sensation to the hand with less energy, by vibrating a lightweight belt made of PET film that covers the body of the mobile device. We developed a prototype that uses two DC motors to vibrate the film belt. The prototype can present vibration in two modes: touching on the belt (on belt), and the belt covering the fingers (under belt). We measured the electric power required to present vibrations of subjectively equal strength and compared it with the conventional technique. The results showed that our prototype consumes significantly less energy than the conventional technique, especially in the under-belt condition.

 

D1.15 IrukaTact: Submersible Haptics
Aisen Chacin1, Takeshi Oozu1, Hiroo Iwata1
1Tsukuba University

 

Abstract: IrukaTact is a wearable actuator for underwater haptic stimulation of the fingertips that provides a hybrid feel of vibration and pressure. The mechanism of this haptic module uses an impeller pump which suctions water from its surrounding environment and propels it onto the volar pad of the finger. One application for this submersible module is to be used as a haptic-sonar scanning probe. We have developed a new testing unit demonstrating the translation of distance data in air into haptic actuation underwater. This testing unit uses an ultrasonic range-finding handheld device to explore distances within the surrounding environment and a small water tank with a submersible thimble display where the user can feel pressure variations of the scanned area on their finger. Beyond this tool’s echo-haptic utility of transmitting a parallel sense of touch, these finger modules could potentially translate virtual object simulation underwater. Demonstrations of this testing unit present an opportunity to collect user feedback on the relationship between distance and haptic perception.

 

D1.16 A Pneu Transparent Shape Display for Touchscreens
Zhentao Xu1, Alex Russomanno1, Sile O'Modhrain1, Brent Gillespie1
1University of Michigan

 

Abstract: While touchscreens are becoming ubiquitous in vehicles, drivers must take their eyes off the road in order to use them. Many drivers miss the tactile feedback of traditional push-button interfaces, which were intuitive to find and use without looking. Push-button interfaces also offer click-feel to confirm the registration of input back to the driver, reducing errors. Though touchscreens augmented with modulated traction forces or vibration can render certain haptic cues, these cues are not enough to support many of the types of user interaction available through push-buttons. In this demonstration, we implement a transparent pneumatic shape display on a touchscreen to render raised features and click-feel. Participants will use a driving simulator while interacting with the touchscreen device to simulate interacting with a dashboard interface. In one mode, the screen remains flat, acting as a traditional touchscreen interface, and in another, transparent physical features with click-feel are rendered on the touchscreen, providing localization and confirmation cues.

 

D1.17 UltraShiver: Lateral Force Feedback on a Bare Fingertip via Ultrasonic Oscillation and Electroadhesion
Heng Xu1, Michael A. Peshkin1, J. Edward Colgate1
1Northwestern University

 

Abstract: For nearly a decade, our group has developed devices that apply in-plane forces to the bare fingertip by combining lateral oscillation with variable friction. We have demonstrated high forces and silent operation, but never in the same device. Here for the first time we present a device, the UltraShiver, that can provide large lateral forces while operating in the ultrasonic regime. UltraShiver consists of a sheet of anodized aluminum excited in a compression-extension mode via piezoelectric actuators. By combining in-plane ultrasonic oscillation and out-of-plane Johnsen-Rahbek electroadhesion, both operating at about 30 kHz, lateral forces are generated. The lateral force is a result of friction being greater when electroadhesion is turned on than when it is turned off. The direction and magnitude of the lateral force can be adjusted by varying the phase between the in-plane oscillation and the electroadhesion. The UltraShiver is a simple and robust device that should serve as the basis for a wide variety of bare finger force feedback applications. Force feedback, however, requires integration with finger position sensing, which is part of our ongoing work. Please note: this system applies small amounts of current to the finger using a high voltage compliant current controller.
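The phase-dependent force generation can be illustrated with a toy friction model; the friction coefficients and the square-wave gating below are our simplifying assumptions, not a model from the paper:

```python
import math

# Toy model of the UltraShiver mechanism: the plate oscillates laterally at
# ultrasonic frequency while electroadhesion raises the effective friction
# during the half-cycle selected by the phase phi. Friction then drags the
# finger more in one direction than the other, leaving a net lateral force.
# All numbers are illustrative.

def net_force(phi, mu_off=0.2, mu_on=0.8, n=10000):
    """Cycle-averaged lateral force on the finger (arbitrary units)."""
    total = 0.0
    for i in range(n):
        t = 2 * math.pi * i / n
        v = math.sin(t)                      # plate velocity
        on = math.sin(t - phi) > 0           # electroadhesion gating
        mu = mu_on if on else mu_off
        total += mu * (1 if v > 0 else -1)   # friction acts along plate motion
    return total / n

f0 = net_force(0.0)           # adhesion in phase with motion: net force one way
f180 = net_force(math.pi)     # opposite phase: force reverses direction
f90 = net_force(math.pi / 2)  # quadrature: contributions cancel over a cycle
```

This reproduces the qualitative behavior stated in the abstract: phase controls both the direction and the magnitude of the rendered lateral force.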

 

D1.18 Haptic design of a soft wearable exoskeleton for intuitive drone control
Carine Rognon1, Stefano Mintchev1, Dario Floreano1
1École polytechnique fédérale de Lausanne

 

Abstract: Piloting a professional drone with current interfaces, such as joysticks and remote controllers, requires long training and constant cognitive effort during flight. Embodied interaction, that is, the bidirectional link between the physical bodies and control systems of the robot and of the human, could not only enable more intuitive control of drones, even for novices, but also provide users with an immersive sensation of flight, which is not rendered in current interfaces. Furthermore, more intuitive control of drones could result in better flight accuracy and reduced training time in professional applications, such as firefighting, inspection, or search and rescue. At EPFL we are designing the FlyJacket, a soft wearable exoskeleton designed to provide an embodied interaction with a drone. Users can fly drones with simple torso movements recorded by embedded IMUs and receive visual, auditory and haptic feedback. This demonstration presents our first attempt to incorporate cable-driven haptic feedback into the FlyJacket. The device gives kinesthetic feedback to the torso with a system of cables actuated independently by four motors. The feedback renders the dynamics of a simulated drone while maneuvering, as well as turbulence or wind gusts, in order to augment flight control performance and the immersive sensation. Acknowledgments: The authors thank F. Dell’Agnola and D. Atienza (ESL, EPFL) for the development of the IMU system of the FlyJacket; J. Miehlbradt, M. Coscia and S. Micera (TNE, EPFL) for the identification of fly gestures; and A. Cherpillod (LIS, EPFL) for the development of the drone controller and simulator. This work is supported by the Swiss National Science Foundation through the National Centre of Competence in Research Robotics (NCCR Robotics). This research has been approved by the EPFL Research Ethics Committee (HREC 020-2016).

 

D1.19 Paper-Based Tactile Display for Multiple Tactile Stimulation
Takuma Hirotsu1, Syoki Kitaguchi1, Kunihiro Kato2, Homei Miyashita2, Hiroyuki Kajimoto3, Hiroki Ishizuka1
1Kagawa University, 2Meiji University, 3The University of Electro-Communications, Japan

 

Abstract: This demonstration presents a paper-based tactile display that can present multiple tactile sensations. Most tactile displays provide only a single tactile stimulus. However, the tactile sensation we feel is a mixture of various tactile stimuli, so presenting multiple stimuli can contribute to realistic tactile rendering. For this purpose, we propose a tactile display that provides the user with a horizontal frictional force and a vertical vibrational stimulus during stroking. The horizontal frictional force is generated by electrostatic force, and the vertical vibrational stimulus is generated by electrical stimulation; the intensities of the two stimuli are controlled separately. This tactile display does not require complicated fabrication processes such as etching and photolithography: the circuit patterns are printed with silver ink on a paper substrate and form the electrostatic and electro-tactile displays in the assembled device. In actual operation, users wear the tactile display on their fingertip. The tactile display is connected to a high-voltage power supply; however, the current is limited to a safe level (0.6 mA). In a preliminary experiment, subjects were able to discriminate among electrostatic, electrical, and combined stimulation. In the demonstration, we will provide the combined tactile stimulation to participants. Additionally, we will also try tactile rendering on a real object that has been made conductive and grounded.

 

D1.20 Localized Haptic-Feedback Large-Area Touch Screen Using Transparent Ferroelectric Polymer Film Actuators
Quang Van Duong1, Thinh Tam Luong1, Seung Tae Choi1, Fabrice Domingues Dos Santos2
1Chung-Ang University, 2Piezotech S.A.S. & Arkema

 

Abstract: In this study, transparent ferroelectric polymer film vibrators were designed and fabricated to provide localized haptic feedback on ten-inch touch screens. Poly(vinylidenefluoride-trifluoroethylene-chlorofluoroethylene) [P(VDF-TrFE-CFE)], known as a relaxor ferroelectric polymer, was used as the vibrator material because it produces large strain and quick response under applied electric fields and has excellent optical transparency. When a human fingertip touches the screen and creates contact between the ferroelectric polymer and the bottom electrode, the electric field is localized near the contact area and the vibration of the ferroelectric polymer film is amplified by the fretting vibration phenomenon. Therefore, the film vibrator can produce localized acoustic waves strong enough to cause a haptic sensation on the human fingertip.

 

D1.21 Presentation of Higher Frequency Vibrations on Surface Acoustic Wave Tactile Display
Masaya Takasaki1, Fumiki Sato1, Masayuki Hara1, Daisuke Yamaguchi1, Yuji Ishino1, Takeshi Mizuno1
1Saitama University

 

Abstract: We have previously developed a tactile display based on surface acoustic waves, a kind of ultrasonic vibration, that can render tactile sensations such as roughness. The basic principle is friction reduction induced by the wave and its control: switching the wave on and off at certain intervals produces vibrations on the user’s finger surface. For realistic rendering, the frequency range of the vibrations that can be presented on the display must be considered. The aim of this research is to investigate the presentation of vibrations at frequencies above 500 Hz. The frequency response of amplitude modulation of the excited ultrasonic wave was measured, and modulation at up to about 10 kHz was observed. The output vibration of the tactile display was also measured with an experimental setup; mechanical vibration up to 10 kHz was successfully acquired, and its frequency characteristics were observed.
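The amplitude-modulation scheme can be sketched as follows; the carrier frequency and modulation depth are placeholder values, since the abstract does not state them:

```python
import math

# Sketch of amplitude modulation of an ultrasonic carrier: the SAW drive
# signal is scaled by a lower-frequency envelope (here 10 kHz), which is what
# converts the friction-reduction effect into a perceivable vibration on the
# finger. The 14 MHz carrier frequency and full modulation depth are
# placeholder assumptions.

def am_sample(t, f_c=14.0e6, f_m=10.0e3, depth=1.0):
    """One sample of the amplitude-modulated drive signal at time t (s)."""
    envelope = 0.5 * (1.0 + depth * math.sin(2 * math.pi * f_m * t))
    return envelope * math.sin(2 * math.pi * f_c * t)

samples = [am_sample(i * 1e-8) for i in range(1000)]
```

The finger does not feel the ultrasonic carrier itself; it feels the envelope, which is why pushing the modulation bandwidth toward 10 kHz extends the range of renderable vibrations.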

 

D1.22 MacaronMix: an online platform for vibrotactile interpolation
Benjamin Clark1, Oliver Schneider1,2, Karon MacLean1
1University of British Columbia, 2Hasso Plattner Institute

 

Abstract: Communicative vibrotactile (VT) icons are expected and valuable components of modern interfaces, yet composing meaningful icons remains difficult. Icons must be crafted by hand or built using time-consuming perceptual evaluations. End-users, researchers, and hapticians alike would benefit from the ability to create new VT icons by mixing existing ones. Such functionality could improve customization, experimentation, and design of new icons by allowing design-by-example, where vibrations are mixed as a visual artist mixes paint. MacaronMix is an online, open-source VT remixing tool that encapsulates and presents an interface for VT interpolation. Users can upload VT icons, then interpolate between them using the algorithm of their choice. We include two: a simple cross-fade between two icons, and a novel perceptually-informed rhythmic interpolation method. In both cases, icons can also be directly edited using our previously-presented vibrotactile editor, Macaron. End-users, researchers, and hapticians can use MacaronMix as a remixing tool directly, or use it as an open-source platform to develop new remixing techniques.
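The simpler of the two interpolation methods, the cross-fade, can be sketched as follows (function names are ours for illustration, not the MacaronMix API):

```python
# Minimal cross-fade between two vibrotactile icons, represented here as
# equal-length lists of amplitude samples: a sample-wise linear blend.

def crossfade(icon_a, icon_b, alpha):
    """alpha=0 returns icon_a, alpha=1 returns icon_b, 0.5 an even mix."""
    if len(icon_a) != len(icon_b):
        raise ValueError("icons must be resampled to a common length first")
    return [(1 - alpha) * a + alpha * b for a, b in zip(icon_a, icon_b)]

mix = crossfade([0.0, 1.0, 0.0], [1.0, 0.0, 1.0], 0.5)
```

A pure cross-fade blends amplitudes but can smear rhythmic structure, which is presumably why the tool also offers the perceptually-informed rhythmic interpolation as an alternative.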

 

D1.23 Stable Haptic Feedback Generation during Mid-air Interactions using Hidden Markov Model based Motion Synthesis
Dennis Babu1, Hikaru Nagano1, Masashi Konyo1, Satoshi Tadokoro1
1Tohoku University

 

Abstract: The generation of stable and realistic haptic feedback during mid-air gesture interactions has garnered significant research interest recently. However, limitations of sensing technologies, such as unstable tracking, non-uniform sampling duration, and occlusions during interactions, distort motion-based haptic feedback. In this demo, we propose a motion synthesis model which tracks human gestures during interaction and recreates smooth, synchronized motion data from detected Hidden Markov Model (HMM) states. The proposed model uses ideal human motion data and the durations of the HMM states recognized during gestures to modulate real-time motion synthesis so that it synchronizes with the actual human motion speed. The audience will experience haptic feedback during gesture interaction in a virtual world using a vibrotactile wristband worn by the user. The experience comprises three gesture tasks: (1) zooming in and out, (2) tapping, and (3) two-handed rotations, all of which are severely affected by occlusion.
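The duration-scaled synthesis idea can be sketched as follows; this is an illustration of linear time-warping to recognized state durations, not the authors' model:

```python
# Sketch of duration-scaled motion synthesis: an ideal per-state motion
# template is linearly time-warped to the state durations actually recognized
# by the HMM, yielding smooth motion synchronized with the user's speed even
# when the raw tracking data is noisy or occluded. Names are illustrative.

def resample(segment, new_len):
    """Linearly resample a 1-D motion segment to new_len samples."""
    if new_len == 1:
        return [segment[0]]
    out = []
    for i in range(new_len):
        x = i * (len(segment) - 1) / (new_len - 1)
        lo = int(x)
        hi = min(lo + 1, len(segment) - 1)
        out.append(segment[lo] + (x - lo) * (segment[hi] - segment[lo]))
    return out

def synthesize(templates, observed_durations):
    """Concatenate each state's template, stretched to its observed duration."""
    motion = []
    for state, dur in observed_durations:
        motion.extend(resample(templates[state], dur))
    return motion
```

Because the haptic signal is derived from the synthesized motion rather than from the raw tracking samples, dropouts and non-uniform sampling no longer translate directly into feedback artifacts.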

 

D1.24 Texturing and Active haptic for automotive applications on Touchscreen
Stephanie Dabic1, Nour Eddine El-Ouardi1, Pedro Adriano1, Stephane Vanhelle1
1Valeo

 

Abstract: Previous studies have shown that the use of haptic feedback can improve the user’s experience. The demonstrator is a 7-inch touch screen with a voice-coil actuator called Vibeo. The actuator consists of a permanent magnet and a coil excited with pulsed alternating current. The voice coil is used in a suspended configuration, fixed just behind the touch screen via an aluminum frame. This suspended architecture allows vibrations in the Z direction (top to bottom, corresponding to the direction of the actuator’s vibrations). The demonstration shows various types of vibrotactile feedback: haptic feedback for press and slide interactions with various physical characteristics. For the slide interaction, an incremental haptic effect is generated with a sinusoidal shape, a frequency of 100 Hz, and acceleration levels from 2 to 6.5 g. For the push interaction, different haptic effects are generated with different parameters: sinusoidal shape, frequencies from 50 Hz to 160 Hz, acceleration levels from 2 to 6.5 g, and signal durations on the surface between 70 and 85 ms. Moreover, various types of texturing sensation can be produced with this kind of vibrotactile feedback. Finally, the mock-up can measure the pressure of one to three fingers in real time; two useful automotive use cases can be shown with this measurement.

 

D1.25 6-DOF Haptic Rendering of Streaming Point Clouds
Maximilian Kaluschke1, Rene Weller1
1University of Bremen

 

Abstract: We present a novel 6-DOF haptic rendering algorithm for streaming point clouds and arbitrary 3D CAD objects. The core of our algorithm is our new volumetric penetration measure for point cloud surfaces. The main idea is to represent the CAD object’s volume by an inner bounding volume graph. In the first part of our algorithm, we identify intersections of the inner bounding volumes with the points, and in the second part, we traverse the graph to find bounding volumes that have completely passed through the point cloud surface. The point cloud does not require any additional data structures. We have implemented a massively-parallel version of our algorithm that runs completely on the GPU. We have tested our algorithm in several demanding scenarios both in theory and practice. Our results show that our algorithm is fast enough to be applied to 6-DOF haptic rendering while providing continuous forces and torques. Attendees will be offered the opportunity to touch live scenes captured by a Kinect with a haptic device.
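The first phase of the algorithm can be sketched with a brute-force test in place of the massively-parallel GPU version; the sphere-shaped inner bounding volume below is an illustrative assumption:

```python
# Sketch of the intersection phase: test which points of the streamed cloud
# lie inside one inner bounding sphere of the CAD object. The real algorithm
# runs such tests for a whole graph of inner bounding volumes, in parallel on
# the GPU; this brute-force single-sphere version just illustrates the test.

def points_inside(sphere_center, sphere_radius, cloud):
    """Return indices of cloud points penetrating the inner bounding sphere."""
    cx, cy, cz = sphere_center
    r2 = sphere_radius * sphere_radius
    hits = []
    for i, (x, y, z) in enumerate(cloud):
        dx, dy, dz = x - cx, y - cy, z - cz
        if dx * dx + dy * dy + dz * dz < r2:  # squared distance avoids sqrt
            hits.append(i)
    return hits

cloud = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.5, 0.5, 0.0)]
hits = points_inside((0.0, 0.0, 0.0), 1.0, cloud)
```

Note that, as the abstract states, this phase needs no acceleration structure on the point cloud itself, which is what makes it suitable for streamed Kinect data.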

 

D1.26 Worn Type Electrovibration Device For Haptic Interaction In Virtual Environments
Baekdong Cha1, Gregory Dunn1, Dongkue Kim1, Jeha Ryu1
1Gwangju Institute of Science and Technology

 

Abstract: When we touch an object, friction between the object and the fingertip produces a sensation of surface texture. In virtual environments, displaying surface texture at the fingertips is essential for an immersive experience, so we propose a device capable of displaying surface textures in an immersive virtual environment. The worn-type electrovibration device provides tactile feedback using electrovibration, thereby immersing users in the virtual environment. It creates haptic texture as the user passively touches a rotating surface. The device is composed of a ball-shaped electrovibration rendering surface controlled by three actuators. The ball is rotated by two of the actuators, which are used to render slip. The third actuator raises and lowers the ball, simulating surface contact, so the device can also act as an encountered-type display. The ball is made of aluminum with an anodized surface and acts as an electrovibration surface that simulates varying friction to create diverse haptic textures. Because the equipment is based on electrovibration technology, a high voltage exceeding 200 V is used. A current-limiting system (<0.7 mA) is built into the control board to ensure that the device operates safely. This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean Government (MSIP) (No. 2011-0030079).

 

D1.27 How Saltational Animals Modulate the Emotional Dimensions of Saltation
Mounia Ziat1, Roope Raisamo2
1Northern Michigan University, 2University of Tampere

 

Abstract: Saltation, or the cutaneous rabbit effect (CRE), is a well-known tactile illusion in which two or three taps delivered to the skin (e.g., on a forearm) are perceived as distributed taps, as if a small rabbit were hopping along the arm. In this demonstration, attendees will have the opportunity to experience the emotional nature of the CRE when presented with visual stimuli of saltational animals, i.e., animals that use hopping and jumping as a means of locomotion. The purpose of the demonstration is to show that the pleasantness or aversiveness of a visual stimulus modulates the three emotional dimensions of tactile saltation (valence: happy or sad; arousal: excited or calm; dominance: in control or controlled).

 

D1.28 Temperature threshold to produce the illusion of wetness by changing contact area between a dry cloth and skin
Mai Shibahara1, Katsunari Sato1
1Nara Women's University

 

Abstract: We found an illusion in which a dry cloth is perceived as wet due to coldness. The illusion is induced by a small decrease in skin temperature when the cloth is soft. This demonstration shows that the contact area between the skin and the object, in particular, affects the temperature change needed to cause the illusion. Our demonstration uses a silicone rubber layer to increase the contact area between the skin and the dry, cold cloth. We expect that the larger the contact area, the easier it is for attendees to perceive an augmented sensation of wetness from the dry cloth.

 

D1.29 Local tactile feedback through time reversal focusing
Charles Hudin1, Harald Zophoniasson1, Christian Bolzmacher1, Moustapha Hafez1
1CEA, LIST, LISA

 

Abstract: The demo will give attendees the opportunity to evaluate a focused-wave approach for localized tactile feedback on a thin glass plate placed on top of a tablet screen. With this approach, flexural waves are focused to create an impulsive displacement at any chosen point on the surface. Multiple points can be created and repeated to form complex spatial and temporal patterns on the transparent surface. The tablet screen underneath simultaneously displays information that the attendee can touch. The technology was introduced in a 2015 paper in IEEE Transactions on Haptics, “Localized Tactile Feedback on a Transparent Surface through Time-Reversal Wave Focusing”, but this is its first public demonstration. We believe the interaction possibilities offered by this technology will interest a large audience. The maximum voltage delivered to the actuators is 60 V, but the electronics, wires, and actuators will be out of the attendees' reach.
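The time-reversal principle behind the focusing can be illustrated in a few lines: if h_i(t) is the impulse response from actuator i to the chosen point, driving each actuator with the time-reversed h_i makes every contribution at that point an autocorrelation, so all contributions peak at the same instant and add coherently. The sketch below is a conceptual illustration under idealized linear, lossless assumptions, not the authors' implementation.

```python
import numpy as np

def time_reversal_drives(impulse_responses):
    """Drive signals: each actuator replays its own impulse response reversed."""
    return [h[::-1] for h in impulse_responses]

def displacement_at_focus(drives, impulse_responses):
    """Displacement at the target point: superposition of each drive signal
    convolved with that actuator's impulse response to the point."""
    return sum(np.convolve(d, h) for d, h in zip(drives, impulse_responses))

# Toy check: with random responses, the summed output peaks sharply at the
# common zero-lag instant (sample len(h) - 1), i.e., the waves focus there.
rng = np.random.default_rng(0)
hs = [rng.standard_normal(64) for _ in range(3)]
u = displacement_at_focus(time_reversal_drives(hs), hs)
```

Away from the focus point the responses differ, so the same drive signals add incoherently, which is what keeps the feedback localized.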

 

D1.30 FerroFluid based lightweight and portable tactile haptic display for displaying persuasive haptic cues of material and geometric perception
Harsimran Singh1, Syed Zain Mehdi1, Jee-Hwan Ryu1
1Korea University of Technology and Education

 

Abstract: We propose a ferrofluid-based tactile display that is lightweight and portable yet can replicate convincing contact orientation and texture information. Numerous studies have developed tactile displays that provide convincing tactile feedback; however, most were limited in portability and restricted to delivering either texture information through vibrational cues or contact orientation through force feedback. To the best of our knowledge, the proposed display is the first wearable tactile display that can deliver texture information together with contact orientation while remaining lightweight and portable. Using ferrofluid minimizes the moving actuator components and eliminates any reactive counterforce on the user's fingernail, which would otherwise distort the tactile sensation. The demo will showcase the ferrofluid-based tactile display for curvature discrimination and texture cues.

 

D1.31 Shape and texture rendering of an image on a touchscreen
Thomas Hausberger1, Florian Enneking1, Michael Terzer1, Zofia Gabriela Jonas1, Yeongmi Kim1
1MCI, University of Applied Sciences

 

Abstract: Although a few touchscreen devices provide simple haptic feedback to confirm the user's input or indicate a state of interaction, haptic feedback for visual content on a touchscreen has received little attention. This demo presents a novel shape and texture rendering device comprising a 3-DoF motion platform and a tablet with a vibrating glass surface. The shape of an image is rendered by the 3-DoF motion platform located beneath the touchscreen tablet, driven by the image's depth map. Piezo elements on the glass surface generate mechanical vibration, producing frictional variation as a fingertip slides over the textured image. Two images will be presented in the demo session: (1) a top view of an Inca pyramid and (2) a male face with a rough beard.

 

D1.32 Experiencing the Influence of Virtual Rigid Object on R-V Dynamics Illusion
Taiki Yamada1, Satoshi Hashiguchi1, Fumihisa Shibata1, Asako Kimura1
1Ritsumeikan University

 

Abstract: The “R-V Dynamics Illusion” is an illusion caused by differing motion states of a real object (R) and a virtual object (V) in a mixed reality space. In previous studies, we found that a real rigid body was perceived as lighter when images of virtual liquid shaking in a virtual box were superimposed on it. As a next step, this study examines whether a similar illusion occurs when the virtual liquid is replaced by a virtual sphere (a rigid body). In this case, since the virtual sphere and the virtual box are rigid and collide with each other, the impressions given by sound and touch could become more important. However, because the sphere and the box are virtual, their collisions produce no sound or tactile feedback unless it is intentionally added, and it is unclear what happens either in the absence of such feedback or when some form of auditory and/or tactile feedback is applied. In this demonstration, you can experience the R-V Dynamics Illusion when the image of a virtual sphere moving inside a virtual box is superimposed on a real rigid body, and you can experience what happens when sound and tactile feedback are or are not added.

 

D1.33 Camera-based Tactile Sensors
Nicolas Alt1, Clemens Schuwerk1, Eckehard Steinbach1
1Technical University of Munich

 

Abstract: Autonomous robots require sophisticated perception systems to manipulate objects. In this context, cameras are a rich source of information and are also precise and cheap. Our novel sensing approach uses cameras together with robust image-processing algorithms to measure forces and other haptic data. The sensor system is based on flexible rubber foam without any internal electronics, attached to the fingers of a gripper. This flexible material deforms characteristically during contact. A camera observes the foam, and image-processing algorithms detect the foam's deformation in the camera images. Finally, our sensor software computes haptic contact data during the grasp procedure. The demo shows the capabilities of this so-called visuo-haptic sensor in a sorting task. A linear robot with a two-finger gripper sorts plastic bottles based on their compliance. During the grasp, the visuo-haptic sensor measures the contact forces, the pressure distribution along the gripper finger, the finger position, and the object deformation, and also estimates the object's compliance as well as properties such as shape and size. The novelty lies in the fact that all of this data is extracted from camera images, i.e., the robot is “feeling by seeing”.

 

D1.34 Reproduction of Textures based on Electrovibration - A Frequency Domain Approach
Tamara Fiedler1, Yasemin Vardar2, Matti Strese1, Eckehard Steinbach1, Cagatay Basdogan2
1Technical University of Munich, 2Koç University

 

Abstract: This demonstration presents a novel approach to displaying textures via electrovibration. We collect acceleration data from real textures using a sensorized tool-tip during controlled scans and capture images of the textures. We then display the acceleration data using a common electrovibration setup. When a single sine wave is displayed, spectral shifts can be observed; this effect originates from the electrostatic force between the finger pad and the touchscreen. According to our previous observations, when multiple sine waves are displayed, interferences occur and acceleration signals from real textures may not feel perceptually realistic. To mitigate this effect, we propose displaying only the dominant frequencies of the real texture signal, taking the JND of frequencies into account. During the demo session, attendees will feel the differences between previously recorded texture signals and our proposed frequency-reduced texture signals. The effect of different numbers of dominant frequencies will also be shown.
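The dominant-frequency reduction described above can be pictured as a simple spectral peak-picking step. The sketch below is a hypothetical illustration of the idea, not the authors' code: the 5% Weber fraction and the peak-merging rule are our own assumptions.

```python
import numpy as np

def dominant_frequencies(signal, fs, k=5, jnd=0.05):
    """Keep only the k strongest spectral components of a recorded
    acceleration signal, skipping peaks that lie within a relative
    frequency JND of an already-selected one. Returns (freq, amplitude)
    pairs that could then drive the electrovibration display."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    order = np.argsort(np.abs(spectrum))[::-1]  # strongest bins first
    chosen = []
    for i in order:
        if freqs[i] == 0.0:
            continue  # ignore the DC offset
        # drop any bin within the (assumed) JND of a kept frequency
        if all(abs(freqs[i] - f) / f > jnd for f, _ in chosen):
            chosen.append((freqs[i], 2.0 * np.abs(spectrum[i]) / len(signal)))
        if len(chosen) == k:
            break
    return chosen

# Example: a texture-like signal with components at 100 Hz and 150 Hz
t = np.arange(1000) / 1000.0  # 1 s sampled at fs = 1000 Hz
sig = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 150 * t)
peaks = dominant_frequencies(sig, fs=1000, k=2)
```

The reduced signal is then resynthesized from only these components, avoiding the interference between closely spaced sine waves that the abstract mentions.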

 

D1.35 Virtual Two-Finger Haptic Manipulation Method
Alfonso Balandra1, Virgilio Gruppelaar2, Hironori Mitake1, Shoichi Hasegawa1
1Tokyo Institute of Technology, 2Delft University of Technology

 

Abstract: We present a novel and simple method to enable two-finger grasping. The method uses a conventional 6-DOF haptic interface and a surface force sensor that measures the user's grip on the haptic pointer. This force is fed into a virtual coupling that joins two haptic pointers, letting the user simultaneously control the orientation of, and the distance between, the pointers. If an object is placed between them, the user can adjust the grip force to grab and manipulate the virtual object. Unlike other proposals, we avoid extra actuators or special mechanisms to enable grasping on a 6-DOF haptic interface; instead, a force sensor controls the linear movement between the two haptic pointers, simulating the extra degree of freedom needed for grasping. We are aware that this configuration is asymmetrical, but performance evaluation results show that the user's manipulation performance was not significantly affected by the lack of grasping feedback.

 

D1.36 A Digital Book Augmented with Tactile Feedback
Frederic Giraud1, Patricia Plénacoste1, Laurent Grisoni1,2, Christophe Giraud-Audine3, Michel Amberg1, Betty Semail1
1University Lille1, 2INRIA Lille, 3Arts & Metiers Paristech

 

Abstract: This demonstration presents the first digital book augmented with tactile feedback. It is based on e-Vita, a tablet-like device that embeds a friction-reduction tactile display. The scenario and drawings of the book were created by Dominique Maes, a Belgian artist and designer who has already developed his own digital book (“Bleu de toi”, on the App Store). With e-Vita, the artist can express his poetry through visual and auditory feedback as well as tactile feedback: readers can touch, feel, and hear the animals appearing on the screen. Specific software was developed to include the tactile rendering in a standard application; in our example, the haptic book is written in HTML5. The book was presented to readers during an official event at the public library of Lille, France, where more than 30 people, male and female, aged 5 to 70, experienced it. Because we programmed a data logger, we can record the trajectory of users' fingers on the display and analyze their gestures to improve the settings of the tactile feedback. Psychophysical results can thus be obtained from this book.

 

D1.37 Proton Pack: Visuo-Haptic Surface Data Recording
Alex Burka1, Katherine Kuchenbecker1,2
1University of Pennsylvania, 2Max Planck Institute for Intelligent Systems

 

Abstract: The Proton Pack is a visuo-haptic recording instrument designed for data collection in the field. It is portable and self-contained, allowing a human operator to interact with surfaces through various end-effectors while recording surface appearance, 3D shape, and haptic characteristics of the interaction such as normal force, tangential force, vibration, and sound. The sensing rig is handheld, connected to a backpack containing a battery and a computer that records data. A web-based smartphone interface allows for easy control of the process. The paper that we are presenting at this conference explores the impact of scan-time parameters (tool speed and normal force, which the Proton Pack is designed to measure) on a haptic surface classification task. For this demo, the Proton Pack will be configured in a live data-display mode, rather than continuous collection, so attendees can see the motion tracking, contact forces, and images in real time.

 

D1.38 There You Are: Using Thermal Feedback On Forehead As Directional Cues
Wei Peng1, Roshan Peiris1, Zikun Chen1, Kouta Minamizawa1
1Keio University

 

Abstract: The scientific novelty of this proposal is the use of thermal feedback on the forehead for spatial awareness, in a VR environment. In the system, three thermal modules each cover a certain angle in the horizontal plane. When the target object enters a module's coverage area, that module activates and provides the user with a thermal stimulus. As the user turns his or her head, these stimuli are perceived as spatial (directional) cues that guide the user toward the object. During the demonstration, we will switch the modules' coverage angle between 3, 15, and 45 degrees and run the demonstration with both cold and hot stimuli, giving six conditions per attendee. At the end of the demonstration, attendees can rate the performance of each cueing method themselves. Because of individual differences (some people are more sensitive to temperature changes than others), we will adjust the stimulus intensity before each demonstration.

Demonstrations 2

D2.39 Consideration about Haptics in the production of the Traditional Japanese Painting
Hiromi Nakamura1,2
1The Japan Art Institute, 2National Research and Development Organization, Japan

 

Abstract: Both the nervous system and haptics share three physical characteristics, force, vibration, and temperature, and they function similarly to sight. I argue, from an artistic perspective, that sight conveyed by dark colors can substitute for haptics conveyed by particles, and that the two balance each other.

For example, I attempted to test this hypothesis through my Japanese painting “INORI” (“prayer” in Japanese). Applying Weber's law, the tactile difference threshold and the color difference threshold are integrated: the light parts are expressed through haptics and the dark parts through sight.

Furthermore, traditional Japanese painting has developed sophisticated techniques for coloring with natural minerals such as malachite and gold. Each pigment spans 15 color ranges depending on particle size. Pigments are placed on a dish, mixed with glue, and held in the painter's palm to warm them to body temperature.

It is said that the texture of a Japanese painting produces vibrations of about ten to 100 Hertz and a coarseness in the range of nanometers to micrometers, the same range that the sense of touch finds comfortable.

Haptic display, together with sight based on the three-primary-color principle, could make Japanese paintings beneficial to people. It will be crucial to empirically analyze the unification of haptics and sight, so as to contribute to an industry of universal, user-friendly products.

 

D2.40 Aito Haptic Touch – interact with objects in virtual reality
Turo Keski-Jaskari1, Jari Toropainen1, Billy Pitiot1, Jockum Lönnberg1, Stefan Heijboer1
1Aito Touch

 

Abstract: Aito’s Haptic Touch technology is well known for its precise, localized ‘button-like’ feedback for user interface controls. In this demo, we show how the capabilities of this unique piezo-based technology can be leveraged for next-generation 3D interactions with a cost-effective structure that is readily applicable to mass production. Using a cube-shaped object with multiple piezo discs inside, the sensation of grabbing, squeezing, and manipulating a 3D virtual object can be experienced in real life. With a combination of force sensing and precise haptic actuation, the physical sensation of holding the cube in your hands can be varied so that, for example, it feels as if you are manipulating a soft object. The functionality is based on the intelligence in the AitoChip, which can sense and directly drive one or more piezos, individually or together, with the ability to control and alter key parameters on the fly. Built on proven technology and low-cost piezo discs, this demonstrator highlights the potential of Aito’s technology for virtual and augmented reality, gaming, and medical applications.

 

D2.41 Aperture feedback for the Pisa/IIT SoftHand with the Rice Haptic Rocker
Janelle Clark1, Edoardo Battaglia2, Matteo Bianchi2, Manuel Catalano3, Antonio Bicchi2,3, Marcia O'Malley1
1Rice University, 2University of Pisa, 3Istituto Italiano di Tecnologia

 

Abstract: This demonstration allows attendees to experience skin stretch as haptic feedback in conjunction with a prosthetic hand to handle and complete tasks with everyday objects. The closure position of the Pisa/IIT SoftHand is indicated to the user through the Rice Haptic Rocker position on the upper arm. Myoelectric prostheses have seen increased application in clinical practice and research, due to their potential for good functionality and versatility. Yet, myoelectric prostheses still suffer from a lack of intuitive control and haptic feedback, which can frustrate users and lead to abandonment. To address this problem, we propose to convey proprioceptive information for a prosthetic hand with skin stretch using the Rice Haptic Rocker. Although the demonstration will be completed with the handle version of the Pisa/IIT SoftHand, in our experiments the rocker was integrated with the myo-controlled version. A size discrimination test with 18 able bodied subjects was performed to evaluate the effectiveness of the proposed approach. Results show that the Rice Haptic Rocker can be successfully used to convey proprioceptive information. For this demonstration attendees will be able to play with the training materials to learn to use the hand as well as experience haptic feedback.

 

D2.42 Robotic Hand Telemanipulation: Kinfinity-Glove and Spacehand
Maxime Chalon1, Maximilian Maier1
1Institute of Robotics and Mechatronics

 

Abstract: The Kinfinity Glove is a new generation of multi-modal input device for use in virtual reality, robotics, gaming, and more. It combines joint position sensing and finger touch sensing in a sensor fusion algorithm to improve rendering quality, allowing users to accurately manipulate objects with a robotic hand without finger-level feedback, which cannot always be practically realized. It can be used to manipulate objects of any size in virtual reality, operate connected machines and robots, enhance and accelerate design processes, or train challenging finger motions, e.g., for surgery or fabrication practice. The main contribution is the algorithm that improves the matching between the user's input and the robotic system's signals, thereby minimizing user fatigue. It uses a proprietary sensor technology to achieve high accuracy and adaptability. The glove serves as a telemanipulation input device for the DLR haptic interface as well as for the Spacehand, a dexterous robotic hand designed for the RSGS mission. The demonstration will let users manipulate the Spacehand robot with the glove and use the glove in a virtual reality scene (3D viewer). The glove's working principle, combined with the processing algorithm, is the novelty of the demonstration.

 

D2.43 High-definition surface-haptics keyboard
Nicolas Huloux1, Jocelyn Monnoyer1,2, Julien Diperi1, Marc Boyron1, Michael Wiertlewski1
1Aix-Marseille Université, CNRS, 2PSA Peugeot Citroën

 

Abstract: Surface haptics devices have proven effective at delivering complex and rich tactile sensations directly to the user's bare finger interacting with a flat plate. This family of interfaces has the potential to significantly improve our interaction with machines. One leading technology uses ultrasonic levitation to modulate the friction force the finger experiences when sliding over a glass plate. Although effective, the actual force the user receives is not homogeneous, due to the presence of nodal lines; it does not cover the entire frequency spectrum, because of the dynamics of the plate; and it suffers from large variability, owing to the complex nature of skin tribology. In this demonstration, we present an ultrasonic friction-reduction device with closed-loop control of the friction force felt by the finger. The regulated friction force produces high-fidelity tactile stimuli thanks to a low-noise force-sensing scheme. The interface is equipped with optical finger-position sensing that resolves 20 µm increments and enables a 5 kHz refresh rate of the friction modulation. Participants will experience the rendering of a virtual keyboard on a large, 10″ high-definition surface haptic interface.
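The closed-loop regulation described above can be pictured as a simple control loop: measure the friction force on the finger, compare it to the target, and adjust the ultrasonic vibration amplitude (higher amplitude means lower friction). The sketch below is only illustrative; the PI structure, gains, update rate, and clamping are our own hypothetical choices, not the authors' controller.

```python
class FrictionController:
    """Minimal PI loop regulating finger friction by modulating the
    ultrasonic vibration amplitude. Gains and the 5 kHz rate are
    illustrative assumptions."""

    def __init__(self, kp=0.8, ki=50.0, rate_hz=5000.0):
        self.kp, self.ki = kp, ki
        self.dt = 1.0 / rate_hz  # one update per force sample
        self.integral = 0.0

    def update(self, target_friction, measured_friction):
        """Return the ultrasonic amplitude command in [0, 1].
        Friction above target -> positive error -> raise amplitude
        (more levitation, less friction)."""
        error = measured_friction - target_friction
        self.integral += error * self.dt
        u = self.kp * error + self.ki * self.integral
        return min(max(u, 0.0), 1.0)  # clamp to the actuator's range

# Example step: measured friction (0.5 N) exceeds the target (0.2 N),
# so the controller raises the vibration amplitude.
ctl = FrictionController()
u = ctl.update(target_friction=0.2, measured_friction=0.5)
```

In the actual device the loop would run at the sensing rate, with the friction force inferred from the low-noise force sensor while the finger slides.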

 

D2.44 The underactuated hand exoskeleton for active grasping assistance
Mine Sarac1, Daniele Leonardis1, Massimiliano Solazzi1, Antonio Frisoli1
1Scuola Superiore Sant'Anna - Percro

 

Abstract: In this demonstration, we present an underactuated hand exoskeleton that assists users in grasping tasks, with improved force transmission and adjustability to the size and shape of the grasped object. The mechanism is complete only when worn by the user, which provides adaptability to different finger sizes. The device connects to the first and second finger phalanges to rotate the MCP and PIP joints of each finger. The mechanical joints do not have to be aligned with the finger joints, ensuring safe operation. Only directional forces are applied to the finger phalanges, which significantly improves the function of the fasteners connecting the exoskeleton to the fingers and the wearability of the device. The underactuation automatically adjusts the finger components of the exoskeleton while assisting the user in grasping different objects with a single actuator. Each finger component sits above the finger, leaving the palm free for real grasping tasks. To provide significant output force, screw-driven DC motors control each finger component independently. To overcome the resulting non-backdrivability, force-sensor-based control (using single 1-DoF strain gauges) has been implemented for each finger component. In this demonstration, participants will be able to wear the hand exoskeleton and experience different control modes, ranging from active assistance in hand grasping and opening to the rendering of graspable virtual objects with different stiffnesses.

 

D2.45 Interacting with the virtual reality: rendering of pressure, textures, and making/break contact sensations via fingertip wearable haptic devices
Guido Gioioso1, Giovanni Spagnoletti1, Leonardo Meli1, Tommaso Lisini Baldi1, Claudio Pacchierotti 2, Domenico Prattichizzo1,3
1University of Siena, 2CNRS, 3Istituto Italiano di Tecnologia

 

Abstract: Head-mounted displays, such as the Oculus Rift, and unobtrusive tracking systems, such as the Leap Motion, are making Virtual Reality (VR) experiences increasingly immersive and convincing. However, users are still not able to touch virtual objects, which severely limits the immersiveness of VR systems. This work presents a novel wearable haptic system for immersive virtual reality experiences. It consists of a Leap Motion controller, which tracks the motion of the fingertips; an Oculus Rift headset, which provides a first-person view of the environment; and three wearable fingertip tactile devices, which provide the sensation of touching virtual objects. Each wearable device is composed of two platforms, one on the nail side of the finger and one in contact with the finger pulp, connected by three cables. A small servomotor controls the length of the cables, moving the platform toward or away from the fingertip, and a voice coil actuator on the platform provides vibrotactile stimuli. The devices can therefore render different levels of pressure at the fingertip, the sensation of making and breaking contact with virtual objects, and textures.

 

D2.46 Tactile stimulation of the entire hand
Basil Duvernoy1, Vincent Hayward1,2, Sven Topp3
1Institut des Systèmes Intelligents et de Robotique, 2University of London, 3University of Sydney

 

Abstract: Humans are social by nature, so for people who are both deaf and blind, daily life becomes far more complex. The main goal of this project is to compensate for this lack of information by creating a haptic stimulation apparatus capable of translating one of their tactile languages, Lorm.

To recreate the patterns of this deafblind tactile language, the apparatus uses twenty-four electromagnetic actuators. In each, a permanent magnetic rod moves up and down inside a coil, driven by the magnetic field the coil induces, to create two different stimuli, both normal to the skin. The first is a brief contact comparable to a ‘tap’; the second is a vibration of up to 1000 Hz that can create different types of illusion.

The apparatus consists of three parts. The first is the user interface, shaped like a large computer mouse, where the actuators contact the skin. The second translates letters and numbers into waveforms for the actuators. The third converts these digital waveforms into analog signals via a DAC and supplies the current needed by each actuator.

 

D2.47 Teaching abstract concepts with haptics
Nicolò Balzarotti1, Gabriel Baud-Bovy1
1Istituto Italiano di Tecnologia

 

Abstract: The WeDraw European project aims to teach abstract mathematical and geometric concepts to children aged 6 to 10 using multiple sensory modalities such as vision, hearing, and haptics. As part of this project, we developed an add-on for the SensAble Omni, a 3-DoF force-feedback haptic device. The add-on mounts on the part of the device that supports the stylus and includes two mono speakers and two high-brightness LEDs to provide additional feedback to the user. The add-on brings co-located feedback about the virtual 3D environment, which is missing when classical devices such as a computer monitor or tablet provide the visual and audio feedback. Co-located feedback should facilitate multisensory integration and interaction with the virtual environment. Various proof-of-concept applications have been developed; their aim is to provide an interactive environment that encourages the user to learn abstract physical and mathematical concepts. In the demonstration, we will show this add-on and its use in educational applications.

 

D2.48 Masking of Electrical Vibration Sensation Using Mechanical Vibration for Presentation of Pressure Sensation
Vibol Yem1, Hiroyuki Kajimoto1
1The University of Electro-Communications, Japan

 

Abstract: In our previous study, we found that electrical stimulation produces both vibration and pressure sensations at the same time and that it is difficult to control these sensations individually. It is known, however, that a mechanical vibration sensation on an index finger can be masked by another mechanical vibration at the same frequency presented to the forearm. In our present study, we confirmed that this phenomenon can be used to mask the electrical vibration sensation in an index finger, retaining only the electrical pressure sensation, by applying mechanical vibration to the palm. Our experiment also showed that both the pressure sensation intensity and the acceleration amplitude of the mechanical vibration increase as the electrical current intensity increases. In the demo, we present mechanical vibration to the palm and electrical stimulation to an index finger of the same hand. Participants can adjust the intensity of the electrical current and the amplitude of the mechanical vibration, and will be asked to confirm whether the electrical vibration sensation decreases, and whether a pressure-like sensation remains, as the amplitude of the mechanical vibration presented to the palm is increased.

 

D2.49 Vibrotactile Stimulation for a Car-based Motion Platform
Hirotaka Shionoiri1, Rei Sakuragi1, Ryo Kodama1, Ryuta Okazaki1, Hiroyuki Kajimoto1
1The University of Electro-Communications, Japan

 

Abstract: To reduce the cost of motion platforms (MPs), Virtual Reality (VR) systems with automobiles as an MP have been proposed. This produces whole-body motion by controlling the car with an accelerator and a brake. However, the system can only present body motion of up to 2 Hz, which is relatively small compared with commercially available MPs. In this paper, we propose a method of expanding the frequency range by using vibrators around the driver’s seat. We developed an experimental device that imitates a car, but body motion was generated via the human hand. We compared three conditions: swing only, vibration only, and both stimuli, experienced alongside VR content. The results suggest that presenting vibration and body swing in combination significantly increases the feeling of presence, and decreases the feeling of unnaturalness.

 

D2.50 Haplink: A low cost, open source, two-degree-of-freedom, 3-D printed haptic kit
Melisa Orta Martinez1, Michal Rittikaidachar1, Allison Okamura1
1Stanford University

 

Abstract: This demonstration showcases Haplink: a low-cost, open-source, two-degree-of-freedom, 3-D printed haptic kit designed for educational applications. Haplink was designed as a customization project of Hapkit and can be manufactured and assembled by students. Haplink uses a novel serial drive mechanism that keeps both motors grounded. Users will manipulate Haplink to feel different virtual environments, such as the inside and outside of boxes of different stiffnesses and materials, and interact with graphical representations of those objects on a computer screen. Attendees will also be able to handle and assemble Hapkit in order to understand its evolution into a 2-DOF device. Both Hapkit’s and Haplink’s open-source designs are available at http://hapkit.stanford.edu.

 

D2.51 Haptic Vest for Multimodal Immersion in Virtual Environments
Gonzalo García-Valle1, Jose Breñosa1, Manuel Ferre1
1Centre for Automation and Robotics (CAR) UPM - CSIC

 

Abstract: This demonstration presents a haptic vest for interaction with virtual environments; the original objective is its use in the training of security forces. Currently, the vest has two kinds of actuators that create a multimodal interaction with the environment, increasing realism and immersion when users wear the complete virtual reality system. Vibrotactile actuators reproduce sensations such as contact with other characters or with any element of the virtual environment. Furthermore, thermoelectric actuators let users feel the temperature variations that the virtual character experiences during the game, such as an increasing temperature when approaching a hot spot. All actuators are distributed according to two-point discrimination distances obtained in previous physiological studies, which covered all areas of the body where actuators are placed over the vest. This vibrotactile and thermal distribution therefore reproduces a more reliable and realistic sensation than haptic patterns created with a random or unstructured distribution. Moreover, the vest design emphasizes lightness and adaptability to users of any size. Attendees can wear the vest together with the rest of the virtual reality system, controlling their virtual character while feeling all tactile interactions between the virtual character and the virtual environment. Additionally, two power supply adapters are needed for the proper operation of the haptic vest in order to power the thermoelectric actuators. These regulated transformers connect a 230 V outlet to the two 5 V connectors of the haptic vest.

 

D2.52 Reproduction of Finger Surface Behavior on One-dimensional Textured Surface
Seitaro Kaneko1, Hiroyuki Kajimoto1
1The University of Electro-Communications, Japan

 

Abstract: Understanding the relationship between skin displacement and subjective sensation is indispensable for the design of tactile displays. Previously, we developed a system that can observe the interaction between a textured surface and the finger skin using a technique known as index matching, and we recorded the skin deformation of subjects’ fingers on a one-dimensional textured surface. In this demo, we reproduce the recorded fingertip movement with the Latero display produced by Tactile Labs, which makes it possible to generate the recorded skin deformation at high density. The vibration reproduced here is a recording from one subject’s finger skin.

 

D2.53 VR Touch
Eric Vezzoli1, Thomas Sednaoui1, Zlatko Vidrih1, Vincenzo Giamundo1
1Go Touch VR

 

Abstract: Interaction with virtual reality is currently performed mostly with joysticks; they are an effective way to interact with VR, but they do not provide a natural interaction environment for the user. To provide credible hand interaction in VR, three key elements are necessary: a reliable hand tracking and rendering system, lightweight touch feedback, and a VR environment that responds to the controls in a physically credible manner. In our demonstration, we show the integration of our VR Touch solution with Leap Motion hand tracking and a physical environment developed in Unity.

VR Touch is a wearable ring developed as a commercially viable solution for hand interaction with virtual objects, featuring:

  • 1DOF haptic reproduction (touch, buttons, vibrations, textures)
  • Wireless connection
  • 2 h battery life
  • Scalable concept (1-10 fingers)
  • 20 g of weight

Several VR hand interaction scenarios will be proposed, such as:

  • Button interaction
  • Small object manipulation
  • Elastic object interaction
  • Music playing
  • Gaming

During the demonstration, we will show natural interaction strategies and the increased control, dexterity, and immersion compared with simple vibrational feedback and no feedback.

 

D2.54 Onomatopoeic-based Classification of Generated Sensation on Electrostatic Tactile Display
Hirobumi Tomita1, Satoshi Saga1, Hiroyuki Kajimoto1,2
1University of Tsukuba, 2The University of Electro-Communications, Japan

 

Abstract: Touchscreen interfaces have become increasingly popular worldwide. In recent years, several electrostatic tactile displays have been developed to achieve a rich haptic experience. In this research, we focus on evaluating how the user feels electrostatic stimulation. By employing Japanese onomatopoeic words, we classify the displayed tactile sensations intuitively; onomatopoeia is a convenient way to express tactile sensations in everyday life. In previous research, Watanabe et al. mapped tactile-related onomatopoeic words onto a two-dimensional distribution diagram based on subjective impressions. By connecting electrostatic stimuli to positions on this map, we can design further electrostatic stimuli. To establish these links, we conducted a psychophysical experiment to investigate which onomatopoeic words were suitable for each electrostatic stimulus. In this demonstration, the user can feel the electrostatic tactile sensation corresponding to an onomatopoeic word chosen from a provided Japanese word list. Our device applies up to 600 V to insulated electrodes; however, the current is safely limited to 2 mA by the circuit.

 

D2.55 Haptic Rules! Augmenting the gaming experience in traditional games: The case of Foosball
Elia Gatti1, Dario Pittera1, Jose Luis Berna Moya1, Marianna Obrist1
1University of Sussex

 

Abstract: We present a haptically augmented version of the classic foosball game. In our demo, we showcase a revisited foosball table that creates friction on users’ rods, enhancing engagement and creating stress-like feelings. In particular, four colored sensor plates are mounted in the four corners of the foosball table’s field. The red plates, once hit, actuate four motors (controlled by an Arduino microcontroller) positioned at the top of the opponent’s rods, increasing the friction on them. The green plates are used by the blocked players to release their own rods and allow friction-free play. In this work, we modified the traditional game of foosball by introducing what we call haptic rules: simple technological augmentations that enable users to control the haptic feedback experienced during physical interaction with the game. Results from our user study showed that the haptic rules enhanced and improved the users’ experience with the game, an effect that proved stable across time. We believe that this demo will stimulate interesting discussions for future work aimed at rethinking classic games in a modern tactile way.

 

D2.56 Foldaway: a pocket-size, foldable, haptic interface
Marco Salerno1, Stefano Mintchev1, Jamie Paik1
1École polytechnique fédérale de Lausanne

 

Abstract: In a world where machines and electronic devices are becoming ubiquitous and portable, the demand for low-cost and ultra-portable haptic interfaces is growing rapidly. However, the market is currently populated either by bulky and expensive interfaces that render forces with high accuracy, or by simple devices that exploit vibrations to render a limited number of sensations. The FOLDAWAY haptic device is an ultra-portable, low-cost origami haptic interface. The device has three degrees of freedom and interacts with human fingers by tracking their motion and providing force, stiffness, and texture perception. Through its unique origami manufacturing, it is the first interface of its kind that folds away, collapsing into a plane when not in use. Its palm-sized, low-profile design makes it an ideal user interface for any portable device. The quasi-2D origami design and manufacturing of FOLDAWAY yields a device that is cost-competitive, scalable, and requires no manual assembly. The proposed demonstration will show the effectiveness of the custom electromagnetic actuation system integrated into the origami manufacturing, which is used to actively render haptic feedback to the user’s fingertip. The FOLDAWAY haptic device will be demonstrated on two platforms: a) a PC mouse with haptic feedback, and b) a platform for digital drawing that conveys the feel of drawing tools.

 

D2.57 Interactive SPA Skin
Harshal Sonar1, Sagar Joshi1, Matthew Robertson1, Jamie Paik1
1École polytechnique fédérale de Lausanne

 

Abstract: We propose a demonstration of a soft, stretchable wearable device that is placed directly on the skin to provide modulable, high-fidelity haptic feedback: a soft pneumatic actuator (SPA) based device with integrated tactile sensors. The compliant SPA-skin is capable of producing high-frequency (0–40 Hz) vibrations with modulable amplitude (up to 4 mm). With its embedded piezoceramic sensors, quasi-2D form, and low profile, the soft SPA-skin provides an ideal “hardware” interface for high-fidelity haptic feedback backed by sensing and closed-loop actuator control. The prototype measures 60×60×1.5 mm and consists of 16 independently actuated and controlled elements arranged in a 4×4 matrix. Each element, a “mechano-pixel”, consists of an actuation layer and a sensing layer. The actuation layer is constructed by creating a pneumatic channel between two adjacent silicone layers. The sensing layer consists of a piezoceramic (PZT) element that produces electrical charge proportional to the contact force. Since the layers are fabricated separately and later integrated, they form a modular, highly customizable, and failsafe design. We will show two demonstrations: a) producing kinetic and vibrotactile feedback using a single mechano-pixel by modulating the blocked force (up to 1 N with a resolution of 0.05 N) and frequency (over 40 Hz with a resolution of 0.1 Hz); and b) demonstrating multimodal information transfer between wearer and computer. Different modalities such as location, shape, pattern, and vibration can be conveyed to the wearer. Furthermore, the device can also be used to read input from the user.

 

D2.58 Modelling of Anal Sphincter Tone based on Pneumatic and Cable-driven Mechanisms
Luc Maréchal1, Alejandro Granados1, Lilian Ethapemi2, Shengyang Qiu1, Christos Kontovounisios1, Christine Norton3, Fernando Bello1
1Imperial College London, 2Central London Community Healthcare NHS Trust, 3King’s College London

 

Abstract: Our current haptics-based simulator prototype has been presented and demonstrated at a number of international conferences and public engagement events, proving very popular and raising considerable interest amongst attendees due to both its relevance for clinical practice and its technological innovation. The audience will first be briefed on the clinical context of Digital Rectal Examinations and the challenges associated with teaching and learning unsighted examinations. They will then be able to palpate internal anatomy through haptics, as well as assess healthy and abnormal sphincter tone cases through our proposed pneumatic and servomotor mechanisms. Our approach is novel given that current training benchtop models are static, with no means to characterise the dynamic function of sphincter tone. In contrast to the motor-encoder mechanism demonstrated at AsiaHaptics ’16, we will demonstrate our proposed pneumatic and servomotor approaches based on anorectal manometry, and give attendees an opportunity to ‘feel’ healthy and abnormal cases that have been evaluated quantitatively and qualitatively. Only one person can participate at a time, although the initial explanation and the see-through view of internal anatomy and palpation will be visible to more attendees. One demo presenter will stand next to the participant doing the examination while the other engages with the rest of the audience. Participants can complete an examination in less than a minute. There are no potential safety risks associated with our demo. The demonstration can be set up in about an hour. The standard space allocation is sufficient and we require four power sockets. Two people are typically enough to explain the system. The suggested floor plan suffices for our demonstration.

 

D2.59 Active Robotic Hand Illusion with vibrotactile feedback
Jakob Fröhner1,2, Thomas Hesse1, The Vu Huynh1, Philipp Beckerle1
1Technische Universität Darmstadt, 2University of Siena

 

Abstract: In this demonstration, the Rubber Hand Illusion (RHI) is induced using an actively moving robotic hand with tactile feedback at the fingertips. The RHI is a phenomenon in which participants perceive a rubber hand as their own hand after the rubber hand and the hidden real hand are stroked simultaneously (Botvinick & Cohen, 1998). Recent investigations show that this illusion also occurs if the artificial hand is an actively moving hand, e.g. a robotic hand, that mimics the hidden participant’s hand (Caspar et al., 2015). The experimental setup considers implications from Beckerle et al. (2016), which analyzed design requirements of robotic hand and leg illusions. In the demonstration, the attendee sits in front of a table while his or her right arm is covered by a black box to hide it from view. The robotic hand is positioned on the table next to the black box. The position of the robotic hand is anatomically plausible with respect to the attendee’s body, and its visual appearance is similar to a human hand since the robotic hand is likewise covered by a glove. The attendee’s hidden hand wears a sensory glove that captures each of the attendee’s voluntary finger movements separately; thus, each robotic finger mimics the corresponding finger of the attendee. If the robotic fingers touch an object, the sensory glove provides vibrotactile feedback at the fingertips. The intensity of this tactile feedback is modulated according to the contact forces between the robotic fingers and the environment. In contrast to other investigations, e.g. Caspar et al. (2015) and Kalckert & Ehrsson (2014), the demonstration includes individual movement of each finger and vibrotactile feedback.

References
Beckerle, P., De Beir, A., Schürmann, T., & Caspar, E. A. (2016). Human body schema exploration: Analyzing design requirements of Robotic Hand and Leg Illusions. In Robot and Human Interactive Communication (RO-MAN), 2016 25th IEEE International Symposium on (pp. 763–768). IEEE.
Botvinick, M., & Cohen, J. (1998). Rubber hands ‘feel’ touch that eyes see. Nature, 391(6669), 756.
Caspar, E. A., De Beir, A., Magalhaes De Saldanha Da Gama, P. A., Yernaux, F., Cleeremans, A., & Vanderborght, B. (2015). New frontiers in the rubber hand experiment: when a robotic hand becomes one’s own. Behavior Research Methods, 47(3), 744–755.
Kalckert, A., & Ehrsson, H. H. (2014). The moving rubber hand illusion revisited: Comparing movements and visuotactile stimulation to induce illusory ownership. Consciousness and Cognition, 26, 117–132.

 

D2.60 Linepod: a portable sensemaking platform for the blind
Oliver Schneider1, Kevin Reuss1, Nico Boeckhoff1, Julius Rudolph1, Adrian Kuchinke1, David Stangl1, Robert Kovacs1, Patrick Baudisch1
1Hasso Plattner Institute

 

Abstract: Current displays for the blind, such as Braille displays and screen readers, tend to primarily convey text. In order to display spatial information, such as charts and maps, or laid-out documents, such as slides, posters, or web pages, we previously presented Linespace, a large interactive tactile display that uses lines as its primitive. In this demo, we present Linepod, a redesign of Linespace that produces very high-quality raised tactile lines in a self-contained, portable form factor. The key element of the device is a 2 W laser mounted in a knife cutter, which lasers lines into legal-sized sheets of swell paper under the control of the user’s mobile phone. Applications typically start by creating a document composed of raised lines, allow users to interact with the document using touch and speech input/output, and then interactively refine the document by plotting additional lines. We demonstrate selected Linepod applications, including a tactile web browser and a photo-taking and post-processing application.

 

D2.61 Vibrotactile Rendering of Camera Motion for Bimanual Experience of First-Person View Videos
Daniel Gongora Flores1, Hikaru Nagano1, Masashi Konyo1, Satoshi Tadokoro1
1Tohoku University

 

Abstract: We propose a vibrotactile rendering method for the motion of the camera in first-person view videos. This method uses two vibrotactile actuators and enables people to feel two types of haptic effects with the hands: transient effects and motion effects. In the first case, we generate transient effects based on the vertical component of the estimated camera displacements. In the second case, we produce motion effects that travel from hand to hand by modulating constant-frequency sinusoidal waves with the estimated horizontal displacements of the camera. Compared with similar works that strive to improve the video-watching experience using haptic feedback, this project is characterized by its simplicity and portability. The setup consists of just two vibrotactile actuators, and there are no special requirements regarding the video recording settings. Prolonged exposure to first-person view videos may cause motion sickness in some participants.

 

D2.62 FLEXMIN - A Teleoperated Parallel Robot for Single Incision Surgery
Johannes Bilz1, Sebastian Matich1, Carsten Neupert1, Johanna Miller2, Jonas Johannink2, Christian Hatzfeld1
1Technische Universität Darmstadt, 2University Hospital Tübingen

 

Abstract: This demonstration presents the FLEXMIN robot, a teleoperation system based on parallel kinematics for single-incision surgery. The system will be equipped with a gripper and a high-frequency cauterization device to demonstrate surgical tasks such as cutting and manipulating tissue. Two sensors are incorporated into the gripper to deliver haptic feedback to the user: a classical force sensor integrated in the proximal part of the parallel robot, and an acceleration sensor attached to the tip of the instrument. Two distinct mechanisms (a delta kinematic mechanism and a voice coil actuator) convey these signals to the user. The cauterization device provides only low-frequency feedback, based on proximal sensors in the parallel kinematic structure and the delta mechanism. The interaction scene of the robot is shielded from direct user interaction to minimize safety risks (such as HF cauterization or mechanical damage by the robot). Emergency buttons are included in the setup, and each attendee will receive a thorough introduction to the system. The system was previously demonstrated at the IROS conference in 2015, but without haptic feedback capabilities. The demonstration needs a minimum space of 3 m width and 2.5 m depth, but only a single chair and two 230 V outlets with separate fuses.

 

D2.63 Providing Haptics to Walls & Heavy Objects in Virtual Reality by Means of Electrical Muscle Stimulation
Pedro Lopes1, Sijing You1, Lung-pan Cheng1, Sebastian Marwecki1, Patrick Baudisch1
1Hasso Plattner Institute

 

Abstract: We explore how to add haptics to walls and other heavy objects in virtual reality. When a user tries to push such an object, our system actuates the user’s shoulder, arm, and wrist muscles by means of electrical muscle stimulation, creating a counter force that pulls the user’s arm backwards. Our device accomplishes this in a wearable form factor, which allows for real-walking VR without obstructions. We devised and tested two visuo-haptic designs that provide our virtual objects with different types of haptic effects: (1) the repulsion design (visualized as an electrical field) and (2) the soft design (visualized as a magnetic field). We demonstrate the effectiveness of our VR obstacle designs by letting participants experience a virtual world in which all objects provide haptic EMS effects, including walls, gates, sliders, boxes, and projectiles.

 

D2.64 Presenting pseudo-haptic feedback in immersive VR environment by modifying avatar’s joint angle
Jotaro Shigeyama1, Nami Ogawa1, Takuji Narumi1, Tomohiro Tanikawa1, Michitaka Hirose1
1The University of Tokyo

 

Abstract: We propose a system that presents a feeling of resistive force by modifying the joint angles of the user’s avatar, together with an approach to reduce the feeling of discomfort evoked by a conflict between visual and proprioceptive sensations. Pseudo-haptic feedback makes it possible to provide haptic sensations, without any complicated devices, by introducing a discrepancy between the position of the user’s body in the real world and the avatar that represents a part of the user’s body in an immersive virtual environment. However, a larger discrepancy between proprioceptive and visual sensations can cause a feeling of discomfort, which reduces the effectiveness of the pseudo-haptic feedback. Also, modifying the displacement of a single body part cannot maintain consistency between the whole body of the user and the avatar in an immersive virtual environment. To avoid these problems, we propose a pseudo-haptic approach that modifies the avatar’s joint angles. Our experiments showed that modifying multiple joint angles of an avatar’s arm reduces the feeling of discomfort while still presenting a certain intensity of resistive force, compared with changing only a single joint angle. In the demonstration, we show several applications of our pseudo-haptic feedback system in a conventional VR setup.

 

D2.65 Wearable Cutaneous Haptic Device with Soft Stretch Sensors and IMUs
Yongjun Lee1, Myungsin Kim1, Yongseok Lee1, Dongjun Lee1
1Seoul National University

 

Abstract: We introduce a new wearable cutaneous haptic device (WCHD) that brings together the synergy of hard devices and soft stretch sensors (SSS). We implement the WCHD with an SSS and an IMU. The SSS measures the relative position/rotation between the upper WCHD body and the lower contact plate, which is hard to obtain with IMUs alone due to magnetic interference, and bulky if implemented with servo-motors and rotary encoders. We plan to prepare a live three-finger manipulation for virtual haptic interaction using our WCHD. We will demonstrate the performance of our WCHD through a real-time multi-contact virtual peg-in-hole scenario based on the PMI (passive midpoint integration) haptic rendering framework. If possible, we also plan to demonstrate virtual tele-manipulation over the internet. There are no safety issues in wearing and testing the WCHD and the haptic simulation.

 

D2.66 The Hapticore device
Myrthe Plaisier1, Ernst Jan Bos2, Reinier Hill2, Jeroen Smeets3
1Delft University of Technology, 2Hapticore, 3Vrije Universiteit Amsterdam

 

Abstract: Here we will demonstrate the Hapticore device, an example of a new class of haptic interfaces. A magneto-rheological fluid (MRF) is used to manipulate the global viscosity around the hand. This allows virtual objects to be rendered by letting the liquid solidify as soon as the hand reaches the virtual boundaries of the simulated object. Earlier attempts used complex magnetic force field designs to create local MRF viscosity changes for haptic feedback. The current approach applies a homogeneous viscosity change in combination with visual information to create the sensation of grasping an object, offering a compact, efficient, and cost-effective solution. Participants will wear a data glove that allows tracking of the hand posture and visualization of the hand on a monitor while it is inside the device. They insert their hand with the data glove into a rubber glove that is fixed in a container of magnetic liquid (iron particles in silicone oil). By applying a weak magnetic field (max ~0.25 Tesla; metal objects can be safely inserted), the viscosity of the liquid is controlled, ranging from that of thin oil to the consistency of unbaked clay. The power supply is medically certified and the liquid is isolated with two layers of electrical insulation and two layers of mechanical insulation.

 

D2.67 HANDSON-SEA: A Series Elastic Educational Robot for Physical Human Robot Interaction
Ata Otaran1, Ozan Tokatli2, Volkan Patoglu1
1Sabancı University, 2University of Reading

 

Abstract: To gain proficiency in physical human-robot interaction (pHRI), it is crucial for engineering students to have the opportunity to physically interact with and gain hands-on experience of the design and control of force-feedback robotic devices. We present HandsOn-SEA, a single-degree-of-freedom educational robot that features series elastic actuation and relies on closed-loop force control to achieve the desired level of safety and transparency during physical interactions. The proposed device complements the existing impedance-type Haptic Paddle designs by demonstrating the challenges involved in the synergistic design and control of admittance-type devices. Unlike the Haptic Paddle designs, the series elastic robot necessitates two position sensors: one for measuring the motor rotations and another for measuring the deflections imposed on the elastic element. The deflections of the cross-flexure pivot are measured using a Hall-effect sensor, a low-cost PWM voltage amplifier is used to drive the DC motor, and the controllers for the SEA robot are implemented on a low-cost microcontroller. HandsOn-SEA was presented at the EuroHaptics 2016 conference, both as a demo at the conference and at the following Touch and Go event. It has been used twice in a senior-level mechatronics course and is offered as open hardware on the website of the HMI Laboratory of Sabancı University.

 

D2.68 HapticHead: A Spherical Vibrotactile Grid around the Head WHC Demo 2017
Oliver Beren Kaul1, Michael Rohs1
1Leibniz University of Hannover

 

Abstract: Current virtual and augmented reality head-mounted displays usually include no vibration motor, or only a single one, for haptic feedback and do not use it for guidance. We present HapticHead, a system utilizing multiple vibrotactile actuators distributed in three concentric ellipses around the head for intuitive haptic guidance through moving tactile cues. In our accepted CHI 2017 paper (paper and video already available at http://hci.uni-hannover.de/research/haptichead) we conducted three experiments, which indicate that HapticHead vibrotactile feedback is both faster (2.6 s vs. 6.9 s) and more precise (96.4 % vs. 54.2 % success rate) than spatial audio (g-HRTF) for finding visible virtual objects in 3D space around the user. Mean final precision with HapticHead feedback on invisible targets is 2.3°, compared to 0.8° with visual feedback. We successfully navigated blindfolded users to real household items at different heights using HapticHead vibrotactile feedback, independently of a head-mounted display. In the WHC demo we plan to show our current prototype in combination with an Oculus Rift and VR demos: immersive perceptible rain, a virtual tactile wall, a virtual tactile butterfly, shockwaves, and guidance demos for both visible and invisible targets.

 

D2.69 An Ultrasound-based Audio-Tactile System
Georgios Korres1, Mohamad Eid1
1New York University Abu Dhabi

 

Abstract: A promising tactile display technology involves focusing multiple ultrasound beams to generate a tangible tactile sensation in mid-air. One limitation is the audible noise. To address this limitation, we present a method in which the audible noise is modulated with sound effects to create a tactile-auditory simulation system. The demonstration setup consists of three components (Fig. 1): a hologram display, an ultrasonic tactile display, and a hand/finger tracker. The hologram display consists of an Aerial Imaging (AI) mirror at 45° so that a floating 2D image is displayed in mid-air above the screen. The ultrasound tactile display is one tile of the Haptogram system (10×10 transducers). The hand tracking device is a Leap Motion sensor. The AI display projects a linear potentiometer knob in mid-air as a controller for a table lamp. When the user swipes the virtual knob, the hand tracking sensor captures the user’s gesture and activates the ultrasound tactile display at the contact point. Two effects are created: (1) the intensity of the tactile stimulation changes linearly to simulate contact, and (2) the acoustic noise is modulated to play a swiping sound effect. The swiping interaction is used to turn a physical bulb on and off.

 

D2.70 Enhancing Human-Computer Interaction via Electrovibration
Senem Ezgi Emgin1, Bushra Sadia1, Yasemin Vardar1, Cagatay Basdogan1
1Koç University

 

Abstract: We present a compact tablet that displays electrostatic haptic feedback to the user. We track the user’s finger position via an infrared frame and then display haptic feedback through a capacitive touchscreen based on that position. To demonstrate the practical utility of the proposed system, the following applications have been developed: (1) an Online Shopping application that allows users to feel the cord density of two different fabrics; (2) an Education application that asks the user to add two numbers by dragging one number onto another to match the sum, where, after the first number is selected, haptic feedback assists the user in selecting the right pair; (3) a Gaming/Entertainment application that presents a bike-riding experience on three different road textures: smooth, bumpy, and sandy; (4) a User Interface application in which users are asked to drag two visually identical folders and, while dragging, can differentiate the amount of data in each folder based on the haptic resistance.

 

D2.71 Hand-held Surface Material Classification Systems
Matti Strese1, Eckehard Steinbach1
1Technical University of Munich

 

Abstract: We present a content-based surface material retrieval (CBSMR) system for tool-mediated freehand exploration of surface materials. We demonstrate a custom 3D-printed sensorized tool and a common smartphone running an Android-based surface classification app, both of which can classify a set of surface materials without relying on explicit scan-force and scan-velocity measurements. The systems rely on features representing the main psychophysical dimensions of tactile surface material perception from our prior work, namely friction, hardness, macroscopic roughness, microscopic roughness, and warmth. We use a haptic database of 108 surface materials as the reference for a live demonstration of both devices on the proposed surface material classification task during the conference.

 

D2.72 Tactile Computer Mouse for Interactive Surface Material Exploration
Matti Strese1, Andreas Noll1, Eckehard Steinbach1
1Technical University of Munich

 

Abstract: We demonstrate the prototype of a novel tactile input/output device for exploring object surface materials on a computer. Our proposed tactile computer mouse is equipped with a broad range of actuators to present various tactile cues to the user while preserving the input capabilities of a common computer mouse. The design and implementation of our device is motivated by the insights of the work in [1], which reveals the five major tactile dimensions relevant for human haptic texture and surface material perception; the mouse provides tactile feedback for these dimensions, namely microscopic and macroscopic roughness, friction, hardness, and warmth. We use a haptic database from our prior work to provide the necessary signals (e.g., acceleration, image, or sound data) for the recreation of surface materials, and display them using our surface material rendering application.

 

D2.73 Demonstration of Realistic Friction Rendering using Time-Varying Friction Coefficients
Shoichi Hasegawa1, Yoshiki Hashimoto1, Hironori Mitake1
1Tokyo Institute of Technology

 

Abstract: We propose the use of time-varying friction coefficients, in contrast to the constant coefficients assumed in conventional haptic rendering. The proposed method reproduces more realistic stick-slip vibrations than conventional approaches. We implemented it on a SPIDAR haptic interface, and attendees can evaluate the effectiveness of the proposed method by comparing friction sensations in the virtual and real worlds.
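One common way to realize a time-varying friction coefficient, sketched below, is to let the coefficient decay toward a kinetic value while the contact slips and recover toward a static value while it sticks, which naturally produces stick-slip cycles. This is an illustrative sketch with assumed parameter values, not the authors' published formulation.

```python
def friction_force(normal_force, tangential_vel, mu, dt,
                   mu_static=1.0, mu_kinetic=0.6, recovery_time=0.05):
    """One rendering step with a time-varying friction coefficient.

    mu relaxes toward mu_kinetic while slipping and back toward mu_static
    while sticking; returns (rendered force, updated mu). All constants
    are illustrative placeholders.
    """
    slipping = abs(tangential_vel) > 1e-4
    target = mu_kinetic if slipping else mu_static
    mu += (target - mu) * min(dt / recovery_time, 1.0)  # first-order relaxation
    direction = -1.0 if tangential_vel > 0 else 1.0     # oppose sliding motion
    return direction * mu * normal_force, mu
```

Calling this once per haptic servo tick (e.g., at 1 kHz) makes the rendered friction drop sharply at slip onset and rebuild during sticking, reproducing the vibration pattern that constant-coefficient rendering misses.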

 

D2.74 Electroadhesive Audio-Tactile-Visual Workstation
Craig Shultz1, Michael A. Peshkin1, Edward Colgate1
1Northwestern University

 

Abstract: We demonstrate a combined audio-tactile-visual workstation in which a user’s finger encounters software-generated haptic effects and also serves as a co-located origin of audible sound. This is achieved through a broadband electroadhesive modulation technique on a transparent electrode surface with a visual monitor beneath. Custom high-bandwidth electronics make it possible to apply a wide range of tactile, audible, and ultrasonic forces to the finger using variable friction. Touch position and velocity, measured by the system using an IR touch frame, are available as inputs to render dynamic audio-tactile feedback via the finger in real time. A flexible software framework allows us to create frictional effects that develop temporally or spatially. The software is based on cross-platform, open-source C++ audio libraries, which allows for ease of use, modularity, and bootstrapping of existing audio rendering software and tools. Please note: this system applies small currents to the finger using a high-voltage-compliant, current-limiting amplifier.
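The role of the measured touch position and velocity in rendering a spatial effect can be illustrated with a simple grating: the drive amplitude is a function of finger position, so the temporal frequency the user feels scales with finger speed. The mapping and constants below are illustrative assumptions, not the authors' rendering framework.

```python
import math

def render_grating(x_mm, v_mm_s, period_mm=2.0, max_amp=1.0):
    """Map finger position to an electroadhesive drive amplitude for a
    sinusoidal spatial grating; finger speed determines the temporal
    frequency perceived at the fingertip. Values are illustrative.
    Returns (drive amplitude, felt frequency in Hz)."""
    phase = 2 * math.pi * x_mm / period_mm
    amplitude = max_amp * 0.5 * (1 + math.sin(phase))  # 0..max_amp
    felt_frequency_hz = abs(v_mm_s) / period_mm        # stripes crossed per second
    return amplitude, felt_frequency_hz
```

A purely temporal effect would instead drive the amplitude as a function of time regardless of position; combining both kinds of modulation is what lets frictional effects "develop temporally or spatially."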

 

D2.75 AeroFinger: A 3-DOF Fingertip Haptic Display using Scalable and Miniature 3D Printed Airbags
Yuan-Ling Feng1, Charith Lasantha Fernando1, Kouta Minamizawa1
1Keio University

 

Abstract: We present a novel wearable fingertip haptic display, “AeroFinger”, that is very light to wear, small enough to fit on the fingertip, and uses no electro-mechanical actuation to render 3-DOF force feedback. The AeroFinger display component consists of four miniature airbags made of 3D-printed rubber-like material, so the display size, strength, and shape can be customized by the user. A small full-range speaker is mounted on a closed air chamber, from which air is transferred through a tiny nozzle to each airbag. The speaker’s movements are translated into airbag inflation and deflation, so AeroFinger can display low-frequency vibrations as force sensations and high-frequency vibrations as tactile sensations. For the demonstration, we developed a Unity 3D application that allows users to interact with virtual objects. The application has two modes: one supports directional pressure when the user touches a soft ball or pinches a cup, and the other supports texture feedback when the user strokes a material’s surface.

 

D2.76 Physically Interactive Exercise Games with a Baxter Robot
Naomi Fitter1, Katherine Kuchenbecker1,2
1University of Pennsylvania, 2Max Planck Institute for Intelligent Systems

 

Abstract: As the population of older adults increases beyond the capacity of nursing home facilities, we face a growing need to keep an aging population safe and healthy in their own homes. In response, some researchers have designed robots to promote activity for both physical rehabilitation and general physical exercise. As part of this body of work, our research explores using the Rethink Robotics Baxter Research Robot to promote the types of light exercise shown to keep older adults healthy. Since physical contact is a crucial facet of human social connections, our designed exercise interactions center on hand-to-end-effector touch. We give Baxter haptic intelligence by monitoring signals from the robot’s onboard sensors to determine when contact has occurred and accordingly progress through the designed games. The exercise games also use facial animations, sounds, and motions to engage people and motivate them to participate in gameplay. In our demonstration, visitors will have the chance to play our human-robot exercise games and ask questions about our work. Because this project will be open-source by the summer of 2017, we will also share information on how other researchers can reimplement our work.
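The sense-contact-then-advance loop described above can be sketched as follows. This is a minimal illustrative example under stated assumptions: the deviation-threshold contact test and the `ExerciseGame` class are hypothetical stand-ins, not the authors' open-source implementation or Baxter's actual sensor API.

```python
def detect_contact(torques, baseline, threshold=0.3):
    """Flag a hand-to-end-effector contact when any joint torque deviates
    from its resting baseline by more than a threshold (values illustrative)."""
    return any(abs(t - b) > threshold for t, b in zip(torques, baseline))

class ExerciseGame:
    """Minimal game-loop sketch: advance to the next exercise pose
    whenever a contact event is detected."""
    def __init__(self, poses):
        self.poses = poses
        self.index = 0

    def update(self, torques, baseline):
        if detect_contact(torques, baseline):
            self.index = min(self.index + 1, len(self.poses) - 1)
        return self.poses[self.index]  # pose the robot should hold now
```

In practice the baseline would be re-estimated for each commanded arm pose, since gravity loading changes the resting torques, and the same contact events would trigger the accompanying animations and sounds.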