The Handbook of Multimodal-Multisensor Interfaces, Volume 1 - Sharon Oviatt


      Figure 1.3 From: M. Ernst and H. Bülthoff. 2004. Merging the senses into a robust percept. Trends in Cognitive Sciences, 8(4):162–169. Copyright © 2004 Elsevier Ltd. Used with permission.

      Figure 1.4 (left) From: S. Oviatt, A. Cohen, A. Miller, K. Hodge, and A. Mann. 2012b. The impact of interface affordances on human ideation, problem solving and inferential reasoning. ACM Transactions on Computer-Human Interaction. Copyright © 2012 ACM. Used with permission.

      Figure 2.1 From: B. E. Stein, T. R. Stanford, and B. A. Rowland. 2014. Development of multisensory integration from the perspective of the individual neuron. Nature Reviews Neuroscience, 15(8):520–535. Copyright © 2014 Macmillan Publishers Ltd. Used with permission.

      Figure 2.2 From: K. H. James, G. K. Humphrey, and M. A. Goodale. 2001. Manipulating and recognizing virtual objects: where the action is. Canadian Journal of Experimental Psychology, 55(2):111–120. Copyright © 2001 Canadian Psychological Association. Used with permission.

      Figure 2.3 From: The Richard D. Walk papers, courtesy Drs. Nicholas and Dorothy Cummings Center for the History of Psychology, The University of Akron.

      Figure 2.4 From: A. F. Pereira, K. H. James, S. S. Jones, and L. B. Smith. 2010. Early biases and developmental changes in self-generated object views. Journal of Vision, 10(11):22:1–13. Copyright © 2010 Association for Research in Vision and Ophthalmology. Used with permission.

      Figure 2.6 From: A. J. Butler and K. H. James. 2013. Active learning of novel sound-producing objects: motor reactivation and enhancement of visuo-motor connectivity. Journal of Cognitive Neuroscience, 25(2):203–218. Copyright © 2013 Massachusetts Institute of Technology.

      Figure 2.7 From: A. J. Butler and K. H. James. 2013. Active learning of novel sound-producing objects: motor reactivation and enhancement of visuo-motor connectivity. Journal of Cognitive Neuroscience, 25(2):203–218. Copyright © 2013 Massachusetts Institute of Technology.

      Figure 2.8 From: K. H. James and I. Gauthier. 2006. Letter processing automatically recruits a sensory-motor brain network. Neuropsychologia, 44(14):2937–2949. Copyright © 2006 Elsevier Ltd. Used with permission.

      Figure 2.9 From: K. H. James and T. Atwood. 2009. The role of sensorimotor learning in the perception of letter-like forms: Tracking the causes of neural specialization for letters. Cognitive Neuropsychology, 26(1):91–110. Copyright © 2009 Taylor & Francis. Used with permission.

      Figure 2.10 From: K. H. James and S. N. Swain. 2011. Only self-generated actions create sensori-motor systems in the developing brain. Developmental Science, 14(4):673–687. Copyright © 2011 John Wiley & Sons Inc. Used with permission.

      Figure 2.11 From: K. H. James and L. Engelhardt. 2012. The effects of handwriting experience on functional brain development in pre-literate children. Trends in Neuroscience and Education, 1:32–42. Copyright © 2012 Elsevier Ltd. Used with permission.

      Figure 2.13 From: A. F. Pereira, K. H. James, S. S. Jones, and L. B. Smith. 2010. Early biases and developmental changes in self-generated object views. Journal of Vision, 10(11):22:1–13. Copyright © 2010 Association for Research in Vision and Ophthalmology. Used with permission.

      Figure 2.15 From: E. G. Boring. 1964. Size-constancy in a picture. The American Journal of Psychology, 77(3):494–498. Copyright © 1964 University of Illinois Press. Used with permission.

      Figure 2.16 From: K. H. James and I. Gauthier. 2009. When writing impairs reading: Letter perception’s susceptibility to motor interference. Journal of Experimental Psychology: General, 138(3):416. Copyright © 2009 American Psychological Association. Used with permission.

      Figure 3.2 (bottom left) From: O. S. Schneider and K. E. MacLean. 2016. Studying Design Process and Example Use with Macaron, a Web-based Vibrotactile Effect Editor. In Proceedings of the Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems—HAPTICS ’16. Copyright © 2016 IEEE. Used with permission.

      Figure 3.2 (bottom right) Courtesy of Anton Håkanson.

      Figure 3.4 Based on: B. Buxton. 2007. Sketching User Experiences: Getting the Design Right and the Right Design. Morgan Kaufmann Publishers Inc.

      Figure 4.1 From: K. Hinckley, J. Pierce, M. Sinclair, and E. Horvitz. 2000. Sensing techniques for mobile interaction. In Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology, pp. 91–100. San Diego, CA. Copyright © 2000 ACM. Used with permission.

      Figure 4.2 Based on: W. Buxton. 1995. Integrating the periphery and context: A new taxonomy of telematics. In Proceedings of Graphics Interface ’95. Quebec City, Quebec, Canada.

      Figure 4.3 Based on: W. Buxton. 1995. Integrating the periphery and context: A new taxonomy of telematics. In Proceedings of Graphics Interface ’95. Quebec City, Quebec, Canada.

      Figure 4.4 Image courtesy of iStock.com/monsitj. Webpage © Bill Buxton http://www.billbuxton.com/multitouchOverview.html.

      Figure 4.5 (video) From: M. Wu and R. Balakrishnan. 2003. Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays. In Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology (UIST ’03), pp. 193–202, Vancouver, BC, Canada. Copyright © 2003 ACM. Used with permission.

      Figure 4.6 (video) From: C. Harrison, R. Xiao, J. Schwarz, and S. E. Hudson. 2014. TouchTools: leveraging familiarity and skill with physical tools to augment touch interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’14). Copyright © 2014 ACM. Used with permission.

      Figure 4.7 From: M. Annett, A. Gupta, and W. F. Bischof. 2014. Exploring and Understanding Unintended Touch during Direct Pen Interaction. ACM Transactions on Computer-Human Interaction 21(5): Article 28 (39pp). Copyright © 2014 ACM. Used with permission.

      Figure 4.8 Image courtesy of Julia Schwarz, http://juliaschwarz.net.

      Figure 4.8 (video) From: J. Schwarz, R. Xiao, J. Mankoff, S. E. Hudson and C. Harrison. 2014. Probabilistic palm rejection using spatiotemporal touch features and iterative classification. In CHI ’14. Toronto, Canada. Copyright © 2014 ACM. Used with permission.

      Figure 4.9 Video courtesy of P. Brandl, J. Leitner, T. Seifried, M. Haller, B. Doray, and P. To. Used with permission.

      Figure 4.10 (video) From: I. Siio and H. Tsujita. 2006. Mobile interaction using paperweight metaphor. In Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology (UIST ’06). Copyright © 2006 ACM. Used with permission.

      Figure 4.11 (video) Courtesy of M. Annett, A. Gupta, and W. F. Bischof. Used with permission.

      Figure 4.12 Based on: K. Hinckley and H. Song. 2011. Sensor synaesthesia: Touch in motion, and motion in touch. CHI ’11. Vancouver, BC, Canada. ACM, New York.

      Figure 4.13 (video) From: K. Hinckley and H. Song. 2011. Sensor synaesthesia: Touch in motion, and motion in touch. CHI ’11. Vancouver, BC, Canada. Copyright © 2011 ACM. Used with permission.

      Figure 4.14 (video) From: M. Goel, J. Wobbrock, and S. Patel. 2012b. GripSense: Using built-in sensors to detect hand posture and pressure on commodity mobile phones. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology (UIST ’12). Copyright © 2012 ACM. Used with permission.

