Colocated Sound and Interaction

Sound, Vibration, and Retroaction in Deformable Displays

Introduction

Visual displays on deformable surfaces are compelling because they are easy to adopt for a generation of users who have been heavily trained, since the mid-1980s, in ocularcentric touch interactions with planar screens. Visual display is not the only form of output possible on deformable surfaces, however, and for many applications it will not turn out to be the best.

The two well-known problems with colocating deformation and image are:

  • Occlusion and visual distortion at the very points of interaction that are most likely to interest the user.
  • The difficulty of mapping the obvious sensed parameters (depth or pressure) to visual form at commensurate resolution.

    We are exploring other sense modalities for deformable displays that avoid these problems and offer interesting affordances for display designers.

Sound is one of those modalities, so obvious and pervasive that it is easily overlooked. The state of the art in human expressivity and interactivity for deformable displays is the Indian tabla drum. Good tabla players have sufficient control to play melodies on the surface, using palm pressure to control pitch and their fingers to sound the drum with a wide variety of dynamics and timbral change. We are still very far from being able to sense or actuate a surface with sufficient precision in time and space to support the control musicians have over such surface interactions. The last musical instrument developed in the West with comparable control was the clavichord. The keys of the clavichord couple the fingers to what become the movable bridges of multiple strings. Good clavichord players command a dynamic range equal to that of the piano, in addition to pitch and timbral control – both impossible on the piano. The piano displaced the clavichord by being much louder and more suitable for the concert hall than the parlor.

    HCI designers usually want more control over the mapping between input and output than a special purpose musical instrument provides. The insertion of digital computation and electrical transduction involves too many compromises to discuss fully here. We will instead focus on a few key challenges using the author’s work and examples from various collaborators.


    Figure 1: the Tablo

The author’s “Tablo” is a controller designed to capture the palm and finger gestures a tabla player might use. Sound output is emitted from below the surface. An array of programmable lights provides visual feedback (analogous to the frets of a guitar), shining through the translucent, conductive, stretchable fabric. This fabric drapes over electrically resistive strips, lowering their resistance, which then becomes a measure of displacement. Piezoresistive fabric segments around the annular base of the instrument estimate pressure and provide a reference point for measuring dynamic changes in displacement as an estimate of finger velocity.
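The displacement and velocity estimation described above can be sketched as follows. This is a minimal illustration, not the Tablo's actual firmware; the calibration constants and function names are assumptions.

```python
# Sketch: estimating displacement and finger velocity from the
# resistance of a strip under stretchable conductive fabric.
# Calibration constants below are illustrative assumptions.

R_REST = 10_000.0   # strip resistance with no contact (ohms), assumed
R_FULL = 1_000.0    # resistance at maximum displacement (ohms), assumed

def displacement(resistance: float) -> float:
    """Map a strip's resistance to a normalized displacement in [0, 1].

    Pressing the fabric onto the strip lowers its resistance,
    so lower resistance corresponds to deeper displacement.
    """
    d = (R_REST - resistance) / (R_REST - R_FULL)
    return min(max(d, 0.0), 1.0)

def velocity(readings: list[float], dt: float) -> float:
    """Estimate finger velocity as the rate of change of displacement
    over a short window of resistance readings spaced dt seconds apart,
    using the earliest reading as the reference point."""
    if len(readings) < 2:
        return 0.0
    return (displacement(readings[-1]) - displacement(readings[0])) / (
        dt * (len(readings) - 1)
    )
```

In the instrument itself, the pressure segments at the annular base would supply the reference reading, so that velocity is measured relative to the resting drape of the fabric rather than an absolute zero.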

Anna Flagg and Hannah Perner-Wilson have both adapted the two main piezoresistive pressure-surface design patterns [2] to deformable applications using stretchable conductive and piezoresistive fabrics.

    The basic construction is illustrated for Flagg’s “Cuddlebot” in figure 2.

    Figure 2: Cuddlebot construction

     The Cuddlebot is an affective robot that displays its “emotional” responses as sounds and vibration according to the nature of sensed interactions with its body surface.


Figure 3 shows Perner-Wilson’s robot skin. It uses a tubular adaptation of the basic fabric pressure-multitouch design of Figure 4 [3]. Display in this case is retroaction, in the form of motion of segments of the robot arm.

    Figure 3: Robot Skin

Figure 4: Piezoresistive Pressure Multitouch

Figure 5 shows the author’s most recent experiment towards capturing the responsiveness of hand drums [1]. From the display point of view it is interesting because the fabric sensing materials are built directly on the moving surfaces of loudspeaker drivers. This colocates sensing, audio, and vibrotactile stimulation, and leverages both recent developments in control theory that make the coupled systems stable and new sound transducers with flat, robust diaphragms.

    Figure 5: Colocation

    Discussion and Conclusion

Two major challenges persist in the engineering of the display systems presented: temporal and spatial resolution. The legacy of 1980s office-automation applications – low frame rates (30–60 Hz) and low input data rates (<100 Hz) – persists and pervades the implementation of desktop and mobile operating systems. New applications of these displays in gaming, music, and other situations where tight multimodal integration is important require controlled latencies and sample rates better than 1 kHz. We have shown that the sensing-data-rate problem can be finessed by translating the sensor data into digital audio [4]. The spatial-resolution problem is currently hard to solve without more reliable ways to connect the stretchable materials to the rigid materials supporting the electronics. One of the more promising approaches is to build the electronics themselves from stretchable or bendable materials.
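The sensor-data-as-audio finesse can be sketched as a simple encoding: each sensor channel is sampled at audio rate and carried as one channel of an audio stream, inheriting the >1 kHz rates and tight, known latency of the audio driver. The rates, bit depth, and function names here are illustrative assumptions, not the actual system of [4].

```python
# Sketch: treating sensor data as digital audio. Each sensor becomes
# one audio channel; 12-bit ADC values map to float samples in [-1, 1].
# Constants and names are illustrative assumptions.

import numpy as np

AUDIO_RATE = 48_000  # audio samples per second per channel (assumed)

def sensors_to_audio(frames: np.ndarray) -> np.ndarray:
    """Map raw ADC frames of shape (n_samples, n_sensors), with values
    in [0, 4095], to float audio samples in [-1, 1]."""
    return frames.astype(np.float64) / 2047.5 - 1.0

def audio_to_sensors(audio: np.ndarray) -> np.ndarray:
    """Inverse mapping: recover the 12-bit sensor values on the
    receiving side of the audio interface."""
    return np.round((audio + 1.0) * 2047.5).astype(np.int64)
```

The design choice here is that once sensor data is framed as audio, all of the operating system's low-latency audio plumbing (buffering, clocking, channel routing) applies to gesture data for free.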

    References

[1] Freed, A. Integration of Touch Pressure and Position Sensing with Speaker Diaphragms (Colocating Loudspeakers and Touch Interaction). Audio Engineering Society Convention, San Francisco, 2012.

[2] Freed, A. Novel and Forgotten Current-steering Techniques for Resistive Multitouch, Duotouch, and Polytouch Position Sensing with Pressure. NIME, 2009.

[3] Schmeder, A. and Freed, A. Support Vector Machine Learning for Gesture Signal Estimation with a Piezo-Resistive Fabric Touch Surface. NIME, Sydney, 2010.

[4] Wessel, D., Avizienis, R., Freed, A. and Wright, M. A Force Sensitive Multi-touch Array Supporting Multiple 2-D Musical Control Structures. NIME, New York, 2007.

Sound, Vibration, and Retroaction in Deformable Displays, Freed, Adrian. SIGCHI Workshop: Organic Experiences: (Re)shaping Interactions with Deformable Displays, Paris, France, April 2013.