Topics

  [Figure: Visualisation of research topics. © iTec]

Human–Robot Mixed Groups

Robots are expected to assist us in a wide range of application fields. Whether in production, in health care, or in the office, robots will have to work with groups of humans and navigate within these social groups. We therefore investigate processes of group formation and group dynamics in human-robot mixed groups.

Selected Publications

  • Abrams, A. M. H., & Rosenthal-von der Pütten, A. M. (2020). I–C–E Framework: Concepts for Group Dynamics Research in Human-Robot Interaction. International Journal of Social Robotics, 1-17. https://doi.org/10.1007/s12369-020-00642-z
  • Rosenthal-von der Pütten, A. M., & Abrams, A. M. H. (2020). Social Dynamics in Human-Robot Groups: Possible Consequences of Unequal Adaptation to Group Members Through Machine Learning in Human-Robot Groups. Proceedings of the 22nd International Conference on Human-Computer Interaction, Copenhagen, Denmark, 19-24 July 2020.

Artificial Social Cognition

Humans can analyse social situations in a split second and know exactly how to behave in them. For robots, this is a very difficult task. We investigate how robots can be enabled to behave adequately in their social environment and how they can analyse that environment for this purpose.

Selected Publications

  • Rosenthal-von der Pütten, A. M., Lugrin, B., Steinhaeusser, S. C., & Klass, L. (2020). Context Matters! Identifying Social Context Factors and Assessing Their Relevance for a Socially Assistive Robot. Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (pp. 409-411). https://doi.org/10.1145/3371382.3378370
  • Lugrin, B., Rosenthal-von der Pütten, A. M., & Hahn, S. (2019). Identifying Social Context Factors Relevant for a Robotic Elderly Assistant. International Conference on Social Robotics (pp. 558-567). Springer, Cham. https://doi.org/10.1007/978-3-030-35888-4_52

Emotions in Human-Robot Interaction

We investigate emotional effects in human-robot interaction. This includes whether humans interact empathetically with robots, how much emotion robots actually need, how artificial expressions of emotion are perceived by humans, and whether and how representations of emotion can improve interaction with robotic systems.

Selected Publications

  • Rosenthal-von der Pütten, A. M., Krämer, N. C., Hoffmann, L., Sobieraj, S., & Eimler, S. C. (2013). An experimental study on emotional reactions towards a robot. International Journal of Social Robotics, 5(1), 17-34. https://doi.org/10.1007/s12369-012-0173-8
  • Rosenthal-von der Pütten, A. M., Schulte, F. P., Eimler, S. C., Sobieraj, S., Hoffmann, L., Maderwald, S., Brand, M., & Krämer, N. C. (2014). Investigations on empathy towards humans and robots using fMRI. Computers in Human Behavior, 33, 201-212. https://doi.org/10.1016/j.chb.2014.01.004

Embodiment & Morphology

How artificially intelligent systems affect us and how we interact with them is shaped by their appearance and embodiment. What difference does it make whether an AI appears virtually on a screen or stands in front of me physically embodied as a robot? What influence do different robot morphologies have on how humans interact with them?

Selected Publications

  • Hoffmann, L., Bock, N., & Rosenthal-von der Pütten, A. M. (2018). The Peculiarities of Robot Embodiment (EmCorp-Scale) Development, Validation and Initial Test of the Embodiment and Corporeality of Artificial Agents Scale. Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (pp. 370-378). https://doi.org/10.1145/3171221.3171242
  • Rosenthal-von der Pütten, A. M., Straßmann, C., & Krämer, N. C. (2016). Robots or agents–neither helps you more or less during second language acquisition. International conference on intelligent virtual agents (pp. 256-268). Springer, Cham. https://doi.org/10.1007/978-3-319-47665-0_23

Mental Models of and for Artificially Intelligent Systems

We investigate which mental models humans form of artificially intelligent systems. When do we perceive robots and virtual characters as intentional agents? How do these models influence interaction? Which factors lead to changes in our mental models?

Selected Publications

  • Rosenthal-von der Pütten, A. M., & Hoefinghoff, J. (2018). The more the merrier? effects of humanlike learning abilities on humans’ perception and evaluation of a robot. International Journal of Social Robotics, 10(4), 455-472. https://doi.org/10.1007/s12369-017-0445-4
  • Bock, N., & Rosenthal-von der Pütten, A. M. (2019). "I think" or "I compute" – How to communicate internal processes to users and its effect on trust. In Edwards, C., Edwards, A., Kim, J., Spence, P. R., de Graaf, M., Nah, S., & Rosenthal-von der Pütten, A. M. (Chair), Human-Machine Communication: What Does/Could Communication Science Contribute to HRI? Workshop conducted at the International Conference on Human Computer Interaction 2019 in Daegu, Korea.

Effects of Nonverbal Behavior of Embodied Artificial Agents

Many virtual assistants and robots are modelled on the human form. This gives them the ability to express themselves not only verbally but also nonverbally. We investigate the influence of nonverbal agent behaviour on how these agents are perceived and on the interaction with them.

Selected Publications

  • Rosenthal-von der Pütten, A. M., Straßmann, C., Yaghoubzadeh, R., Kopp, S., & Krämer, N. C. (2019). Dominant and submissive nonverbal behavior of virtual agents and its effects on evaluation and negotiation outcome in different age groups. Computers in Human Behavior, 90, 397-409. https://doi.org/10.1016/j.chb.2018.08.047
  • Rosenthal-von der Pütten, A. M., Krämer, N. C., & Herrmann, J. (2018). The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. International Journal of Social Robotics, 10(5), 569-582. https://doi.org/10.1007/s12369-018-0466-7

Methods & Scale Development

We contribute to the development of methodological tools for investigating human–technology interaction. This includes, for example, the development of suitable measurement instruments.

Selected Publications

  • Rosenthal-von der Pütten, A. M., & Bock, N. (2018). Development and Validation of the Self-Efficacy in Human-Robot-Interaction Scale (SE-HRI). ACM Transactions on Human-Robot Interaction (THRI), 7(3), 1-30. https://doi.org/10.1145/3139352
  • Rosenthal-von der Pütten, A. M., Bock, N., & Brockmann, K. (2017). Not your cup of tea? How interacting with a robot can increase perceived self-efficacy in HRI and evaluation. 2017 12th ACM/IEEE International Conference on Human-Robot Interaction HRI (pp. 483-492). IEEE. https://doi.org/10.1145/2909824.3020251
  • Hoffmann, L., Bock, N., & Rosenthal-von der Pütten, A. M. (2018). The Peculiarities of Robot Embodiment (EmCorp-Scale) Development, Validation and Initial Test of the Embodiment and Corporeality of Artificial Agents Scale. Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (pp. 370-378). https://doi.org/10.1145/3171221.3171242

Uncanny Valley Hypothesis

The Uncanny Valley hypothesis is widely used as an explanatory model for negative reactions to robots, although it has largely remained an untested conjecture. Our research is therefore devoted to its empirical investigation. This includes exploratory behavioural analyses of human–android interactions with androids in the field, studies evaluating the appearance of robots, and studies testing explanatory accounts of the Uncanny Valley.

Selected Publications

  • Rosenthal-von der Pütten, A. M. (2014). Uncannily Human. Empirical Investigation of the Uncanny Valley Phenomenon (Doctoral dissertation). Available at: http://duepublico.uni-duisburg-essen.de/servlets/DocumentServlet?id=33254.
  • Rosenthal-von der Pütten, A. M., & Krämer, N. C. (2015). Individuals’ evaluations of and attitudes towards potentially uncanny robots. International Journal of Social Robotics, 7(5), 799-824. https://doi.org/10.1007/s12369-015-0321-z
  • Rosenthal-von der Pütten, A. M., & Krämer, N. C. (2014). How design characteristics of robots determine evaluation and uncanny valley related responses. Computers in Human Behavior, 36, 422-439. https://doi.org/10.1016/j.chb.2014.03.066
  • Rosenthal-von der Pütten, A. M., Krämer, N. C., Becker-Asano, C., Ogawa, K., Nishio, S., & Ishiguro, H. (2014). The uncanny in the wild: Analysis of unscripted human–android interaction in the field. International Journal of Social Robotics, 6(1), 67-83. https://doi.org/10.1007/s12369-013-0198-7

Linguistic Alignment with Artificial Agents

We investigate under which circumstances, to what extent, and why humans adapt to artificially intelligent systems and which social effects arise when a system adapts to humans.

Selected Publications

  • Klatt, J., Rosenthal-von der Pütten, A. M., Hoffmann, L., & Krämer, N. C. (2013). A Thousand Words Paint a Picture: Examining Interviewer Effects with Virtual Agents. Intelligent Virtual Agents: 13th International Conference, IVA 2013, Edinburgh, UK, August 29-31, 2013, Proceedings (Vol. 8108, p. 452). Springer.
  • Kühne, V., Rosenthal-von der Pütten, A. M., & Krämer, N. C. (2013). Using linguistic alignment to enhance learning experience with pedagogical agents: the special case of dialect. International Workshop on Intelligent Virtual Agents (pp. 149-158). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40415-3_13
  • Rosenthal-von der Pütten, A. M., Hoffmann, L., Klatt, J., & Krämer, N. C. (2011). Quid pro quo? Reciprocal self-disclosure and communicative accommodation towards a virtual interviewer. International Workshop on Intelligent Virtual Agents (pp. 183-194). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23974-8_20
  • Rosenthal-von der Pütten, A. M., Straßmann, C., & Krämer, N. C. (2016). Linguistic alignment with artificial entities in the context of second language acquisition. CogSci.
  • Rosenthal-von der Pütten, A. M., Straßmann, C., & Krämer, N. C. (2016). Robots or agents–neither helps you more or less during second language acquisition. International conference on intelligent virtual agents (pp. 256-268). Springer, Cham. https://doi.org/10.1007/978-3-319-47665-0_23
  • Rosenthal-von der Pütten, A. M., Wiering, L., & Krämer, N. (2013). Great minds think alike. Experimental study on lexical alignment in human-agent interaction. i-com Zeitschrift für interaktive und kooperative Medien, 12(1), 32-38. https://doi.org/10.1524/icom.2013.0005