Selected Publications

PhD Thesis:
Creating The Illusion of Life in Autonomous Social Robots


Ribeiro, T., 2020. Creating the Illusion of Life in Autonomous Social Robots. PhD Thesis. Instituto Superior Técnico, University of Lisbon, Portugal.

Robots are becoming a new form of animated characters and are being deployed into our society in various social settings that can benefit from the use of technology and artificial intelligence (AI), such as education, entertainment or assisted living.
This thesis explores how such social robots, through their physically expressive embodiment, and considering their autonomous capabilities, may be able to convey the illusion of life just as movie characters do, while interacting with humans.
In particular, we are interested in bringing theories and practices from the field of character animation, and in developing methods and technology that allow animation artists to take a structural role in the development of autonomous social robots.
We establish and describe a new form of animation, called robot animation, which sets out to bring the existing knowledge and techniques from traditional and CGI animation into the field of social robots.
Alongside it, we outline a list of principles of robot animation, based on Disney's original principles of animation.
The SERA model and methodology was created to support the creation of autonomous socially expressive robots; it relies on user-centred design and includes non-technical experts such as psychologists and character animators.
An innovative animation engine called Nutty Tracks was created to blend, during interactions, animations and postures pre-designed by artists with motion that is procedurally generated in real-time.
Nutty Tracks bridges the symbolic level of an autonomous artificial intelligence agent with the lower level of motion generation and control.
This allows us to create autonomous social robots that convey the illusion of life in a way that also lets users understand their communicative intentions.
In order to support complex, articulated robots such as industrial manipulators, we created ERIK, an expressive kinematics technique that brings together inverse kinematics (IK) control and forward kinematics (FK) control.
We add to that the Nutty Motion Filter, a signal processor that interpolates and smooths a motion signal in real-time so that it complies with the mechanical and kinematic limits of joints or of spatial motion, while providing parameters to tweak the expressivity of the resulting motion.
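To give an intuition for what such a filter does, here is a minimal sketch, not the actual Nutty Motion Filter: a hypothetical single-joint filter that drives the joint value towards a target using a trapezoidal velocity profile, clamping velocity and acceleration (the "expressivity" knobs) while respecting position limits. All names and parameter values are illustrative assumptions.

```python
import math

# Illustrative sketch of a limits-aware motion filter (hypothetical code,
# not the actual Nutty Motion Filter): each step drives a joint value
# towards a target while clamping velocity and acceleration, and
# respecting the joint's mechanical position limits.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

class MotionFilter:
    def __init__(self, pos_limits, max_vel, max_acc, dt):
        self.lo, self.hi = pos_limits   # mechanical joint limits (rad)
        self.max_vel = max_vel          # rad/s
        self.max_acc = max_acc          # rad/s^2; lower values feel "softer"
        self.dt = dt
        self.pos = 0.0
        self.vel = 0.0

    def step(self, target):
        err = target - self.pos
        # fastest speed that still allows braking to a stop at the target
        brake_vel = math.sqrt(2.0 * self.max_acc * abs(err))
        direction = 1.0 if err >= 0 else -1.0
        desired_vel = direction * min(self.max_vel, brake_vel)
        # acceleration limit: velocity may only change by max_acc*dt per step
        dv = clamp(desired_vel - self.vel,
                   -self.max_acc * self.dt, self.max_acc * self.dt)
        self.vel += dv
        self.pos = clamp(self.pos + self.vel * self.dt, self.lo, self.hi)
        return self.pos

f = MotionFilter(pos_limits=(-1.57, 1.57), max_vel=2.0, max_acc=4.0, dt=0.02)
trajectory = [f.step(1.0) for _ in range(200)]  # smoothly approaches 1.0
```

Raising `max_acc` makes the motion snappier; lowering it makes it softer and slower to react, which is the kind of expressivity parameter the filter exposes.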
Various use-cases are presented using different robots.
In particular, the Ahoy interactive scenario was developed in which humans play a game of pantomime with the custom-built Adelino robot.
The robot could use its expressive posture to convey hints to the players, while keeping an orientation constraint towards their face.
Similarly, the AvantSatie scenario was created in which Adelino acts autonomously as a piano-game companion.
User studies showed that the participants were able to decode the intention of the robot even if the ERIK solution, running in real-time, was slightly distorting the pre-designed postures in order to solve simultaneously for both orientational and postural goals.
The results provide evidence that expressive postures (controlled using FK) could be used along with IK in order to provide arbitrary robots with an animation model that works out-of-the-box, with nearly no tweaking.
In the long term, our theories, methods and techniques lay the foundation for autonomous expressive robots that exhibit the illusion of life through an artist-enabled approach, while interacting with humans and their surrounding environment.

The Illusion of Robotic Life: Principles and Practices of Animation for Robots

Tiago Ribeiro & Ana Paiva. 2012

ACM/IEEE International Conference on Human-Robot Interaction ’12, Boston, MA, USA – Best Paper Nominee

This paper describes our approach on the development of the expression of emotions on a robot with constrained facial expressions. We adapted principles and practices of animation from Disney™ and other animators for robots, and applied them on the development of emotional expressions for the EMYS robot. Our work shows that applying animation principles to robots is beneficial for human understanding of the robots’ emotions.

This paper may be slightly outdated!
Please read the 2019 arXiv report below which contains an updated version of the Principles.
Please cite my thesis as it fully contains the arXiv report content!

Nutty-based Robot Animation: Principles and Practices

2019 arXiv report.

Works as a summary of my PhD thesis, and also contains an updated version of the Principles of Robot Animation.

Robot animation is a new form of character animation that extends the traditional process by allowing the animated motion to become more interactive and adaptable during interaction with users in real-world settings. This paper reviews how this new type of character animation has evolved and been shaped by character animation principles and practices. We outline some new paradigms that aim at allowing character animators to become robot animators, and to properly take part in the development of social robots. One such paradigm consists of the 12 principles of robot animation, which describe general concepts that both animators and robot developers should consider in order to properly understand each other. We also introduce the concept of Kinematronics, for specifying the controllable and programmable expressive abilities of robots, along with the Nutty Workflow and Pipeline. The Nutty Pipeline introduces the concept of the Programmable Robot Animation Engine, which generates, composes and blends various types of animation sources into a final, interaction-enabled motion that can be rendered on robots in real-time during real-world interactions. The Nutty Motion Filter is described and exemplified as a technique that allows an open-loop motion controller to apply physical limits to the motion while still allowing the shape and expressivity of the resulting motion to be tweaked. Additionally, we describe some types of tools that can be developed and integrated into Nutty-based workflows and pipelines, which allow animation artists to perform an integral part of the expressive behaviour development within social robots, and thus to evolve from standard (3D) character animators towards full-stack robot animators.

Although you can directly cite this report, its contents are fully included in my PhD thesis. Therefore I recommend that you cite my thesis instead!

Ribeiro, T., 2020. Creating the Illusion of Life in Autonomous Social Robots. PhD Thesis. Instituto Superior Técnico, University of Lisbon, Portugal.

ERIK: Expressive Inverse Kinematics Solving in Real-time for Virtual and Robotic Interactive Characters

With new advancements in interaction techniques, character animation also requires new methods to support fields such as robotics and VR/AR. Interactive characters in such fields are becoming driven by AI, which opens up the possibility of non-linear and open-ended narratives that may even include interaction with the real, physical world. This paper presents and describes ERIK, an expressive inverse kinematics technique aimed at such applications. Our technique allows an arbitrary kinematic chain, such as an arm, snake, or robotic manipulator, to exhibit an expressive posture while aiming its end-point towards a given target orientation. The technique runs in interactive-time and does not require any pre-processing step, such as training in machine-learning techniques, in order to support new embodiments or new postures. This allows it to be integrated into an artist-friendly workflow, bringing artists closer to the development of such AI-driven expressive characters, by allowing them to use their typical animation tools of choice, and to properly pre-visualize the animation during design-time, even on a real robot. The full algorithmic specification is presented and described so that it can be implemented and used throughout the communities of the various fields we address. We demonstrate ERIK on different virtual kinematic structures, and also on a low-fidelity robot that was crafted using wood and hobby-grade servos, to show how well the technique performs even on a low-grade robot. Our evaluation shows how well the technique performs, i.e., how well the character is able to point at the target orientation while minimally disrupting its target expressive posture and respecting its mechanical rotation limits.
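To illustrate the kind of problem ERIK addresses, here is a deliberately simplified toy sketch, not ERIK itself: a planar 3-link chain is initialized in a hypothetical artist-designed posture and a plain positional CCD (cyclic coordinate descent) solver pulls its end-effector towards a target point. Starting the solve from the designed posture, rather than from a zero pose, is what keeps the result close to the expressive intent; ERIK's actual solver, orientation goals and joint-limit handling are omitted. All names and values are illustrative assumptions.

```python
import math

# Toy 2D illustration (not the ERIK algorithm): a planar chain starts in
# an artist-designed "expressive" posture and CCD iterations pull the
# end-effector towards a target point, keeping the rest of the chain
# close to the designed posture. Joint limits are omitted for brevity.

LINKS = [1.0, 1.0, 1.0]          # link lengths (hypothetical)
posture = [0.9, -0.6, -0.4]      # designed expressive posture, radians

def fk(angles):
    """Forward kinematics: positions of each joint plus the end-effector."""
    x = y = a = 0.0
    pts = [(0.0, 0.0)]
    for length, ang in zip(LINKS, angles):
        a += ang
        x += length * math.cos(a)
        y += length * math.sin(a)
        pts.append((x, y))
    return pts

def ccd(angles, target, iters=50):
    angles = list(angles)
    for _ in range(iters):
        for i in reversed(range(len(angles))):
            pts = fk(angles)
            ex, ey = pts[-1]     # current end-effector position
            jx, jy = pts[i]      # position of joint i
            # rotate joint i so the end-effector swings towards the target
            cur = math.atan2(ey - jy, ex - jx)
            des = math.atan2(target[1] - jy, target[0] - jx)
            angles[i] += des - cur
    return angles

solved = ccd(posture, target=(1.5, 1.5))
end = fk(solved)[-1]             # end-effector now lies near (1.5, 1.5)
```

The interesting design question, which this sketch sidesteps and ERIK tackles, is how to reach the target while *minimally* disturbing the designed posture and respecting mechanical rotation limits, rather than merely reaching it.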

Nutty Tracks: Symbolic Animation Pipeline for Expressive Robotics

Tiago Ribeiro, Doug Dooley & Ana Paiva. 2013

ACM SIGGRAPH ’13 Posters, Anaheim, CA, USA – Student Competition Finalist

Nutty Tracks is a symbolic real-time animation system for animating any robotic character using animation tools commonly used by professional animators. Our system brings artists and programmers closer to each other in the quest for creating the illusion of life in robotic characters.

The SERA Ecosystem: Socially Expressive Robotics Architecture for Autonomous Human-Robot Interaction

Tiago Ribeiro, André Pereira, Eugenio Di Tullio & Ana Paiva. 2016

AAAI Spring Symposium on Enabling Computing Research in Socially Intelligent Human-Robot Interaction, Stanford University, Palo Alto, CA, USA

Based on the development of several different human-robot interaction (HRI) scenarios using different robots, we have been establishing the SERA ecosystem. SERA is composed of both a model and tools for integrating an AI agent with a robotic embodiment in HRI scenarios. We present the model, and several of the reusable tools that were developed, namely Thalamus, Skene and Nutty Tracks. Finally, we exemplify how such tools and model have been used and integrated in five different HRI scenarios using the NAO, Keepon and EMYS robots.

Emotion Modelling for Social Robots

Ana Paiva, Iolanda Leite & Tiago Ribeiro. 2014

Book Chapter in The Oxford Handbook of Affective Computing

Editors: Rafael A. Calvo, Sidney D’Mello, Jonathan Gratch, Arvid Kappas
Oxford University Press, Dec 2, 2014. ISBN: 0199942234, 9780199942237

This chapter describes current advances in emotion modeling for social robots. It begins by contextualizing the role of emotions in social robots, considering the concept of the affective loop. It describes a number of elements for the synthesis and expression of emotions through robotic embodiments and provides an overview of emotional adaptation and empathy in social robots.

Creating Interactive Robotic Characters: Through a combination of artificial intelligence and professional animation

Tiago Ribeiro & Ana Paiva. 2015

ACM/IEEE International Conference On Human-Robot Interaction ’15, HRI Pioneers Workshop, Portland, OR, USA

We are integrating artificial intelligence agents with generic animation systems in order to provide socially interactive robots with expressive behavior defined by animation artists. Such animators will therefore be able to apply principles of traditional and 3D animation to such robotic systems, and thus achieve the illusion of life in robots. Our work requires studies and interactive scenario development alongside the artists.