Tangible user interfaces (TUIs) couple digital information with graspable physical objects, in contrast to conventional graphical interfaces that rely on general-purpose devices for control (e.g., mouse) and display (e.g., screen) (Fishkin, 2004). A person uses their hands to manipulate one or more physical objects via gestures and actions such as pointing, clicking, holding, and grasping. A computer system detects the movement, changes its state, and provides feedback (Petridis et al., 2006). TUIs are designed to build on our experience and skills from interacting with the non-digital world (Ishii and Ullmer, 1997; Shaer and Jacob, 2009). TUIs offer the possibility of natural interfaces that are intuitive and enjoyable to use as well as easy to learn (Shaer, 2008). TUIs have the potential to enhance learning and problem solving by changing the way people interact with and leverage digital information (Shaer and Jacob, 2009). Current research in tangible interaction includes understanding the design and cognitive implications of TUIs, developing new technologies that further bridge the digital and the physical, and guiding TUI design with knowledge gained from empirical studies.
The goal of this chapter is to provide an overview and general framework for the design of tangible interaction, including consideration of the role of gesture and the impact on cognition. We believe that TUIs have an impact on cognition because they provide affordances that encourage and facilitate specific gestures and actions, making some cognitive activities easier. TUIs change the way we interact with digital information via physical affordances that are distinctly different from pointing and keyboard/mouse interaction. This chapter explores physical manipulation as an interaction design space: TUIs invite a variety of gestures and offer the potential to explore information through novel forms of interaction and discovery. The chapter presents the concepts and design issues of TUIs through two examples: the Tangible Keyboard and Tangible Models. They exemplify two approaches to integrating TUIs with traditional interaction design: the Tangible Keyboard design examines the metaphor of a keyboard in which each key is a tangible object; the Tangible Models design examines physical interaction with 3D objects as proxies for 3D digital models.
2.1.1 TANGIBLE KEYBOARD
Figure 2.2: Pattern Maker application on the Tangible Keyboard. Tangible Keyboard video available on: https://drive.google.com/file/d/0B4S5ptYjjjuGbFEyX2ljUk90LVU/view?usp=sharing.
Tangible Keyboard is a tangible computing platform that adopts a keyboard metaphor in developing tangible devices for touch screen tablets. The tangible interaction design focuses on supporting composition tasks and on the potential for enhancing creative cognition through spatial arrangement. The Tangible Keyboard design provides separate interaction spaces for composition tasks: the whole composition is displayed on the tablet, and the individual pieces of the composition are manipulated on tangible interactive objects. Individual elements are displayed on tangible interactive objects (inspired by Sifteo cubes™), and these smaller displays are manipulated to create a composition on a larger touch display tablet (Merrill et al., 2012). A variety of different gestures and actions on the tangible objects serve as the basis for the interaction design of the Tangible Keyboard. The larger display on a tablet provides visual feedback for compositions, and the touch screen allows users to interact with on-screen content. The affordances of the Tangible Keyboard build on the idea of creating keys, similar to the keys on a keyboard, where the symbols on the keys are interactive and the keys can be rearranged to create a variety of creative patterns. Figure 2.2 illustrates the Tangible Keyboard design with the Pattern Maker application.
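To make the interaction model concrete, the following Python sketch outlines how gesture events on tangible keys might be mapped to updates of a composition on the tablet. It is an illustration only, not the implemented platform; the class names, gesture handlers, and event vocabulary are assumptions introduced here.

```python
# Illustrative sketch only (not the authors' implementation): a possible mapping
# from gestures on tangible keys to composition updates on the tablet display.
# All names here (TangibleKey, Composition, the handler functions) are hypothetical.
from dataclasses import dataclass, field


@dataclass
class TangibleKey:
    key_id: int
    symbol: str  # the interactive symbol currently shown on the key's small display


@dataclass
class Composition:
    elements: list = field(default_factory=list)  # ordered elements shown on the tablet

    def place(self, key: TangibleKey, position: int) -> None:
        """Insert the key's symbol at a position in the composition."""
        self.elements.insert(position, key.symbol)

    def remove(self, position: int) -> None:
        """Remove the element at the given position."""
        self.elements.pop(position)


# Gesture handlers: each physical action on a key triggers a composition update.
def on_key_placed(composition: Composition, key: TangibleKey, position: int) -> None:
    # Placing a key on (or next to) the tablet adds its symbol to the composition.
    composition.place(key, position)


def on_key_shaken(key: TangibleKey, alternatives: list) -> None:
    # Shaking a key could cycle the symbol shown on its display.
    if key.symbol in alternatives:
        idx = alternatives.index(key.symbol)
        key.symbol = alternatives[(idx + 1) % len(alternatives)]
```

In this reading, spatial rearrangement is simply a sequence of place and remove events; the value of the tangible design lies in how directly the physical gestures map onto those events.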
2.1.2 TANGIBLE MODELS
Figure 2.3: Tangible Models interaction design for CAD modeling.
Tangible Models is a tangible computing platform that combines a touchscreen tabletop system with augmented reality, integrating tangible objects on a horizontal display to support 3D configuration design tasks (Kim and Maher, 2008). This tabletop system provides a physical and digital environment for co-located design collaboration. The tabletop system runs a computer-aided design (CAD) program to display a plan view of a 3D design, with physical augmented reality blocks representing objects and their placement on the plan view. The Tangible Models interaction design uses 3D blocks with markers that reference 3D models in the ARToolKit (https://artoolkit.org/). Using ArchiCAD (http://www.graphisoft.com/archicad/), Tangible Models allows the user to arrange 3D models from a library, such as walls, doors, and furniture. The ArchiCAD library provides pre-designed 3D objects that can be selected, adapted, and placed in the new design. The Tangible Models interaction design comprises selection and rearrangement actions on blocks to explore alternative configuration designs. By rearranging 3D models through physical actions on blocks, the affordances of this UI reduce cognitive load by providing direct manipulability and an intuitive understanding of the spatial relationships among the components of the design. Figure 2.3 illustrates the Tangible Models platform using 3D models of furniture from the ArchiCAD library.
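The essential mapping in Tangible Models is from fiducial markers on physical blocks to 3D library objects, with the detected pose of each block driving the placement of its digital counterpart on the plan view. The following Python sketch illustrates that mapping in simplified form; it abstracts marker tracking behind a hypothetical list of detections and does not use the actual ARToolKit or ArchiCAD APIs.

```python
# Illustrative sketch only: markers on physical blocks reference 3D library
# objects, and the detected pose of each block places its digital counterpart
# on the plan view. Marker detection is abstracted as a list of detections;
# the real platform used ARToolKit and ArchiCAD, whose APIs are not shown here.
from dataclasses import dataclass

# Hypothetical mapping from marker ID to a library object name.
MARKER_LIBRARY = {1: "wall_segment", 2: "door_standard", 3: "sofa_two_seat"}


@dataclass
class PlacedObject:
    library_name: str
    x: float
    y: float
    rotation_deg: float


def update_plan_view(detections: list, plan: dict) -> None:
    """Synchronize the digital plan with the physical arrangement of blocks.

    `detections` is a list of (marker_id, x, y, rotation_deg) tuples from a
    hypothetical marker tracker; `plan` maps marker_id -> PlacedObject.
    """
    for marker_id, x, y, rotation in detections:
        name = MARKER_LIBRARY.get(marker_id)
        if name is None:
            continue  # unknown block: ignore it
        plan[marker_id] = PlacedObject(name, x, y, rotation)

    # Blocks lifted off the table disappear from the plan view as well.
    visible = {marker_id for marker_id, *_ in detections}
    for marker_id in list(plan):
        if marker_id not in visible:
            del plan[marker_id]
```

Under this reading, rearranging the blocks is the interaction: each detection cycle re-derives the digital configuration from the physical one, so the physical arrangement and the plan view never drift apart.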
2.2 WHY IS TANGIBLE INTERACTION INTERESTING?
TUIs represent a departure from conventional computing by connecting digital information with graspable objects in the physical world (Fishkin, 2004). Fitzmaurice (1996) defines five core properties as the major differences between tangible interaction devices and mouse/keyboard interaction devices:
1. space-multiplexing of both input and output;
2. concurrent access and manipulation of interface components;
3. strong specific devices;
4. spatially-aware computational devices; and
5. spatial re-configurability of devices.
A hallmark of TUIs is specialized physical/digital devices that provide concurrent access to multiple input devices that can control interface widgets as well as afford physical manipulation and spatial arrangement of digital information and models (Fitzmaurice, 1996; Fitzmaurice and Buxton, 1997; Shaer and Hornecker, 2010). These characteristics affect the way tangible interaction is designed. In addition, tangible interaction is contextual: the design is strongly affected by the context of use. The Tangible Keyboard is designed for composition of elements that do not have a corresponding 3D physical object, such as words, numbers, or 2D shapes. The Tangible Models platform is designed for the composition of elements that have a 3D physical counterpart. We explore these five properties and their characteristics to better understand design principles for TUIs in the context of the Tangible Keyboard and Tangible Models.
2.2.1 SPACE-MULTIPLEXED INPUT AND OUTPUT
Space-multiplexed input and output involves having multiple physical objects, each specific to a function and independently accessible (Ullmer and Ishii, 1997). Time-multiplexed input and output occurs when only one input device is available (for example, the mouse): the user has to repeatedly select and deselect objects and functions (Shaer and Hornecker, 2010). For example, the mouse is used to control different interaction functions such as menu selection, scrolling windows, pointing, and clicking buttons in a time-sequential manner (Jacko, 2012). TUIs are space-multiplexed because they typically provide multiple input devices that are spatially aware or whose location can be sensed by the system. As a result, input and output devices are distributed over space, enabling the user to select a digital object or function by grasping a physical object (Shaer and Hornecker, 2010; Patten and Ishii, 2007).
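The contrast can be summarized schematically: with a time-multiplexed device, one physical controller is repeatedly re-bound to different functions over time, whereas with space-multiplexed tokens each physical object carries its own function and several can be manipulated at once. The short Python sketch below is purely illustrative; the class names are hypothetical.

```python
# Illustrative contrast only; the class names are hypothetical.

class MousePointer:
    """Time-multiplexed input: one device is re-bound to functions over time."""

    def __init__(self):
        self.current_tool = None

    def select_tool(self, tool: str) -> None:
        self.current_tool = tool  # e.g., via a menu or toolbar click

    def act(self, target: str) -> str:
        return f"apply {self.current_tool} to {target}"


class TangibleToken:
    """Space-multiplexed input: each token is permanently bound to one function."""

    def __init__(self, function: str):
        self.function = function  # bound once, e.g., "rotate" or "scale"

    def act(self, target: str) -> str:
        return f"apply {self.function} to {target}"


# With tokens, grasping is selecting, and several can be used concurrently.
rotate_token = TangibleToken("rotate")
scale_token = TangibleToken("scale")
```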
1) Tangible Keyboard
Tangible Keyboard has space-multiplexed input/output devices, which enables graspable rearrangement of the elements of a composition. This design provides a distinct approach to composition that is not supported by the traditional keyboard or mouse owing to the ability to manually rearrange subsets of a composition and control