The Haptic Creature
The Robot: Software
Figure 1 depicts a high-level view of the Haptic Creature architecture, which comprises two software systems. Low-level mechatronics control is handled by the microcontroller firmware; all other processing is handled by the host software. The two systems communicate through a specified protocol transmitted over USB 2.0.
The microcontroller firmware was written in C (MPLAB C for PIC18 v3.31); however, since its function is simply low-level motor control and sensor reading, its code comprises a very small portion of the robot’s software system.
The host software, on the other hand, encompasses the vast majority of the Haptic Creature’s software: 390 classes written in Java (v1.6.0). The host system was developed simultaneously on Gentoo Linux and Apple Mac OS X (v10.6) — and occasionally tested for compatibility on Microsoft Windows XP. Owing to the portability of the Java Virtual Machine (JVM), no special modifications were necessary for the software to run on any of these operating systems.
Figure 2 presents an overview of the primary classes of the host software system. This system is divided into several layers, each of which is categorized as either behavioral or mechatronic.
The Central Nervous System (CNS) layer constitutes the Haptic Creature’s high-level behavior. A Scheduler manages the execution of the Recognizer, Emoter, and Renderer, which allows each to run at an execution frequency independent of the others. To simplify debugging of the current implementation, however, the update rate for all components was set to 30 Hz, the rate required by the highest-frequency class, the Recognizer. A class called to execute when it had no pending work simply performed a NOP (no operation), so this approach incurred little additional overhead.
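For illustration, the following is a minimal sketch of how such a scheduler might be structured; the Updatable interface, the use of a ScheduledExecutorService, and the method names are assumptions for this sketch, not the actual Scheduler implementation.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Hypothetical sketch only; the real Scheduler's internals are not shown here.
    public class SchedulerSketch {

        // Illustrative stand-in for a CNS component; update() is expected to be
        // a NOP whenever the component has no pending work.
        public interface Updatable {
            void update();
        }

        private final ScheduledExecutorService executor =
                Executors.newSingleThreadScheduledExecutor();

        // Schedule one component at its own frequency (Hz), independent of the others.
        public void schedule(Updatable component, double frequencyHz) {
            long periodMicros = Math.round(1_000_000.0 / frequencyHz);
            executor.scheduleAtFixedRate(component::update, 0, periodMicros,
                    TimeUnit.MICROSECONDS);
        }

        public void shutdown() {
            executor.shutdownNow();
        }

        public static void main(String[] args) {
            SchedulerSketch scheduler = new SchedulerSketch();
            // In the current implementation all three components share the same 30 Hz rate.
            scheduler.schedule(() -> { /* Recognizer.update() */ }, 30.0);
            scheduler.schedule(() -> { /* Emoter.update()     */ }, 30.0);
            scheduler.schedule(() -> { /* Renderer.update()   */ }, 30.0);
        }
    }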
The Physical Abstraction layer provides a sensing and actuation interface that separates the Haptic Creature’s behavior — specifically, the Recognizer and Renderer — from its mechatronics. For example, as will be described in greater detail in Physical Renderer, the Renderer manipulates an ear abstractly through a volume parameter rather than directly with a servo position. This has the advantage of allowing the mechatronics of the ear to change — e.g., substituting a motor for the servo — without any need to modify the Renderer class.
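A minimal sketch of this kind of mapping follows; the ServoBridge interface, the setter names, and the linear volume-to-angle mapping are assumptions for illustration (the 0.0 to 180.0 degree range is taken from the Servo class described later).

    // Illustrative sketch of a Physical Abstraction class: the behavioral layers
    // set an abstract volume (0.0 = limp, 1.0 = stiff) and the class translates
    // it into whatever the underlying transducer expects, in this case a servo angle.
    public class EarSketch {

        // Stand-in for the Transducer Bridge Servo class (angle in [0.0, 180.0]).
        public interface ServoBridge {
            void setAngle(double degrees);
        }

        private final ServoBridge servo;

        public EarSketch(ServoBridge servo) {
            this.servo = servo;
        }

        // volume in [0.0, 1.0]; clamped, then mapped linearly onto the servo range.
        public void setVolume(double volume) {
            double clamped = Math.max(0.0, Math.min(1.0, volume));
            servo.setAngle(clamped * 180.0);
        }
    }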
The remaining two layers comprise the low-level sensing and actuation framework for the host software system. The Transducer Bridge layer provides abstract representations of the Haptic Creature’s transducers, thereby presenting a uniform interface to each specific transducer type — currently, Accelerometer, Motor, PressureSensorMesh, and Servo. The Transducer Implementation layer then provides the corresponding implementations specific to a particular mechatronics platform. This framework decouples the classes in the Physical Abstraction layer from the low-level implementation of each transducer, thereby allowing the underlying implementation to vary without affecting other parts of the system. Currently, the robot’s low-level mechatronics are managed solely by the PIC microcontroller platform; however, this framework easily affords the swapping and even intermixing of a variety of alternate low-level solutions.
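The decoupling can be pictured with the following sketch, in which an abstract Servo-like bridge class is paired with a hypothetical PIC-backed implementation; the class names, command format, and assumed 8-bit resolution are illustrative, not the actual protocol.

    // Sketch of the Transducer Bridge / Transducer Implementation split.
    // The abstract class plays the bridge role; PicServoSketch stands in for
    // a PIC-microcontroller-backed implementation.
    public abstract class ServoBridgeSketch {

        // Bridge-level view: a servo is simply an angle in [0.0, 180.0].
        public abstract void setAngle(double degrees);

        // Hypothetical PIC-backed implementation living in the Implementation layer.
        public static class PicServoSketch extends ServoBridgeSketch {
            private final int channel;                 // which servo output on the board
            private final java.io.OutputStream link;   // e.g., the USB serial stream

            public PicServoSketch(int channel, java.io.OutputStream link) {
                this.channel = channel;
                this.link = link;
            }

            @Override
            public void setAngle(double degrees) {
                // Translate the abstract angle into a device command and send it.
                int ticks = (int) Math.round(degrees / 180.0 * 255);  // assumed 8-bit resolution
                try {
                    link.write(new byte[] { (byte) channel, (byte) ticks });
                    link.flush();
                } catch (java.io.IOException e) {
                    throw new RuntimeException("servo write failed", e);
                }
            }
        }
    }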
Sensing
The Sensing component, as the name implies, handles those aspects of the robot that deal with sensing information from the real world. Specifically, it interfaces with the touch and movement sensors via the control hardware (see Communication and Control). This component does little interpretation of the data, save simple filtering and normalization.
The Skin class (Physical Abstraction layer) represents the entirety of the current sensing infrastructure and is composed of two classes from the Transducer Bridge layer. The PressureSensorMesh encapsulates the touch sensor data, which is normalized within the range [0, 1023] and referenced via a row and column index. The Accelerometer encapsulates the movement data, which is normalized within the range [-512, 512] and referenced via an axis index.
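As a sketch of how these abstractions might be read, the interfaces and accessor names below are assumptions; only the value ranges and the row/column and axis indexing follow from the text.

    // Illustrative sketch of reading the Skin abstraction.
    public class SkinReadSketch {

        public interface PressureSensorMesh {
            int rows();
            int columns();
            int pressureAt(int row, int column);   // value in [0, 1023]
        }

        public interface Accelerometer {
            int acceleration(int axis);            // value in [-512, 512]
        }

        // Sum all pressure readings as a crude measure of overall contact.
        public static long totalPressure(PressureSensorMesh mesh) {
            long total = 0;
            for (int r = 0; r < mesh.rows(); r++) {
                for (int c = 0; c < mesh.columns(); c++) {
                    total += mesh.pressureAt(r, c);
                }
            }
            return total;
        }
    }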
Gesture Recognizer
The Gesture Recognizer component queries the Sensing component and constructs an initial model of the physical data. Its function is to manage the variety of sensor information so as to provide a cohesive view. One example is the array of pressure sensors, which, when monitored over time, allows determination of the direction and speed of movement along with pressure intensity.
From this initial model, the Gesture Recognizer component in turn builds a higher-order model of the input data. An example would be distinguishing between a moderate stroke and a firm massage: both require monitoring the direction, speed, and pressure intensity across a range of sensors; however, this component also interprets these values to estimate the intention of the user.
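Although, as discussed next, a full recognizer was not implemented, the following hypothetical sketch illustrates the kind of low-level feature such processing would start from: tracking the pressure centroid between successive sensor frames to estimate direction, speed, and overall intensity. The class, its methods, and the frame format are all assumptions.

    // Hypothetical touch-feature sketch; no such class exists in the current Recognizer.
    public class TouchFeatureSketch {

        // Result of comparing two frames: velocities in sensor cells per second.
        public static class Motion {
            public final double rowVelocity, columnVelocity, meanPressure;
            public Motion(double rowVelocity, double columnVelocity, double meanPressure) {
                this.rowVelocity = rowVelocity;
                this.columnVelocity = columnVelocity;
                this.meanPressure = meanPressure;
            }
        }

        // frames are [row][column] pressure values in [0, 1023]; dt in seconds.
        public static Motion estimate(int[][] previous, int[][] current, double dt) {
            double[] c0 = centroid(previous);
            double[] c1 = centroid(current);
            double mean = c1[2] / (current.length * current[0].length);
            return new Motion((c1[0] - c0[0]) / dt, (c1[1] - c0[1]) / dt, mean);
        }

        // Returns { weighted row centroid, weighted column centroid, total pressure }.
        private static double[] centroid(int[][] frame) {
            double total = 0, rowSum = 0, colSum = 0;
            for (int r = 0; r < frame.length; r++) {
                for (int c = 0; c < frame[r].length; c++) {
                    total += frame[r][c];
                    rowSum += r * frame[r][c];
                    colSum += c * frame[r][c];
                }
            }
            if (total == 0) return new double[] { 0, 0, 0 };
            return new double[] { rowSum / total, colSum / total, total };
        }
    }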
A functioning version of the Gesture Recognizer component was not crucial to my thesis, because it was possible to conduct the related study by simulating its capabilities. Furthermore, a fully functioning version would have been a major undertaking beyond the scope of my thesis, so I implemented only the infrastructure as a placeholder for future work.
The Recognizer class (Central Nervous System layer) represents the host software for the Gesture Recognizer component. At present, this class manages the interface to the Sensing component so that it can query for sensor data. However, the Recognizer class does not apply any additional processing beyond recording the data to an external file for development and testing purposes.
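A sketch of that recording behavior is given below; the CSV layout, file handling, and class name are assumptions for illustration.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    // Sketch of appending one timestamped row of raw sensor values per update.
    public class SensorLogSketch implements AutoCloseable {

        private final PrintWriter out;

        public SensorLogSketch(String path) throws IOException {
            this.out = new PrintWriter(new FileWriter(path, true));  // append mode
        }

        // Write one frame: timestamp followed by every pressure cell, row-major.
        public void record(long timestampMillis, int[][] pressureFrame) {
            StringBuilder row = new StringBuilder();
            row.append(timestampMillis);
            for (int[] sensorRow : pressureFrame) {
                for (int value : sensorRow) {
                    row.append(',').append(value);
                }
            }
            out.println(row);
        }

        @Override
        public void close() {
            out.close();
        }
    }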
Emoter
The Emoter component represents the underlying emotional state of the Haptic Creature. This state is affected either externally, through information from the Gesture Recognizer component, or by means of its own internal mechanisms — e.g., temporal considerations. For example, a gentle stroke might elicit a pleased state, which then gradually decays toward a neutral state shortly after the interaction ceases.
This component itself has no knowledge of the Gesture Recognizer implementation and only cursory knowledge of the Physical Renderer component (necessary for change notification). This allows the model to focus on the domain-specific information of the system without being directly concerned with how it is getting its information or how its state is being presented.
The Emoter class (Central Nervous System layer) represents the host software for the Emoter component. As the Gesture Recognizer component was not yet fully developed, the current implementation of the Emoter class is not affected by inputs from the Gesture Recognizer component. Also, while the Emoter class receives regular timing notifications from the Scheduler class, it does not yet implement temporal considerations. The current version of the Emoter class focuses solely on encapsulating emotional state, described in more detail next, and on notifying listeners of changes to it.
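The following sketch illustrates this encapsulation-and-notification role; the listener interface, its method, and the use of a CopyOnWriteArrayList are assumptions, not a reproduction of the real Emoter class.

    import java.util.List;
    import java.util.concurrent.CopyOnWriteArrayList;

    // Sketch: hold the current emotional state and notify listeners of changes.
    public class EmoterSketch {

        // Illustrative listener interface implemented by, e.g., the Renderer.
        public interface EmotionListener {
            void emotionChanged(double valence, double arousal);
        }

        private final List<EmotionListener> listeners = new CopyOnWriteArrayList<EmotionListener>();
        private double valence;   // [-1.0, 1.0]
        private double arousal;   // [-1.0, 1.0]

        public void addListener(EmotionListener listener) {
            listeners.add(listener);
        }

        // Called by the Scheduler; temporal decay would eventually live here.
        public void update() {
            // Currently a NOP: no temporal considerations are implemented yet.
        }

        public void setEmotion(double valence, double arousal) {
            this.valence = valence;
            this.arousal = arousal;
            for (EmotionListener listener : listeners) {
                listener.emotionChanged(valence, arousal);
            }
        }

        public double getValence() { return valence; }
        public double getArousal() { return arousal; }
    }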
Affect Space
Discrete and dimensional models of emotion are the predominant theories in psychology. For the Haptic Creature, I chose to design its emotion model following the dimensional approach, as it provided a straightforward framework with which to parameterize the robot’s behavior. Furthermore, precedent for this approach already exists within socially interactive robotics.
I designed the Haptic Creature’s emotion model in accordance with the two-dimensional, bipolar affect space adapted from Russell (Figure 3). Conceptually, the horizontal dimension describes the robot’s valence — unpleasant vs. pleasant — while the vertical dimension represents the robot’s arousal — deactivated vs. activated. Its current emotional state, therefore, is defined by specifying a point (v, a) in this affect space, where each dimension is within the range [-1.0, 1.0].
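Concretely, an emotional state can be represented as an immutable pair of clamped values, as in the sketch below; the class name and clamping behavior are assumptions.

    // Sketch of a point (v, a) in the two-dimensional affect space.
    public final class AffectPointSketch {

        public final double valence;   // -1.0 = unpleasant, +1.0 = pleasant
        public final double arousal;   // -1.0 = deactivated, +1.0 = activated

        public AffectPointSketch(double valence, double arousal) {
            this.valence = clamp(valence);
            this.arousal = clamp(arousal);
        }

        private static double clamp(double value) {
            return Math.max(-1.0, Math.min(1.0, value));
        }
    }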
Physical Renderer
The Physical Renderer component is responsible for the higher-order, physical manifestation of the Haptic Creature’s internal state. This component listens for changes in the Emoter component, then translates the results into an orchestrated manipulation of the effectors. For example, when the robot moves into a pleased state, its breathing might adjust to very soft, rhythmic in/out motions while it produces a similarly soft “purr” that can be felt.
The Renderer class (Central Nervous System layer) represents the host software for the Physical Renderer component. This class provides two distinct functions: emotion transitioning and expression control.
When the Emoter component updates the Haptic Creature’s emotional state, the Renderer class must smoothly transition from the emotion actively being expressed to the new emotion. The speed at which this occurs is determined by the valenceReactiveness and arousalReactiveness properties, which specify the time (milliseconds) to transition for the respective affect dimensions. This functionality should not be confused with the temporal considerations presented with the Emoter component above. Rather, these properties exist simply to control how quickly the Renderer responds to changes in emotional state, thus ensuring organic physical transitions.
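One plausible reading of this mechanism is a per-dimension linear ramp, sketched below; only the two reactiveness properties come from the actual Renderer, while the linear interpolation and method names are assumptions.

    // Sketch: each affect dimension ramps from the value currently being
    // expressed to the new target over its own reactiveness time (ms).
    public class EmotionTransitionSketch {

        private final double valenceReactivenessMs;
        private final double arousalReactivenessMs;

        public EmotionTransitionSketch(double valenceReactivenessMs, double arousalReactivenessMs) {
            this.valenceReactivenessMs = valenceReactivenessMs;
            this.arousalReactivenessMs = arousalReactivenessMs;
        }

        // Value expressed at elapsedMs after a transition from 'from' to 'to' began.
        public double valenceAt(double from, double to, double elapsedMs) {
            return ramp(from, to, elapsedMs, valenceReactivenessMs);
        }

        public double arousalAt(double from, double to, double elapsedMs) {
            return ramp(from, to, elapsedMs, arousalReactivenessMs);
        }

        private static double ramp(double from, double to, double elapsedMs, double durationMs) {
            if (elapsedMs >= durationMs) return to;
            double t = elapsedMs / durationMs;   // fraction of the transition completed
            return from + (to - from) * t;
        }
    }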
For a particular emotion, the Renderer class must also manage the physical expression. This is accomplished through a suite of software manipulators, one for each of the Haptic Creature’s effectors: EarManipulator, LungManipulator, and PurrBoxManipulator. Each manipulator is configured via rendering parameters, which will be detailed next.
Rendering Parameters
The manner in which the Haptic Creature displays a particular emotional state is described through a series of key expressions located at specific points in the affect space. A key expression provides a detailed description of the behavior in the form of specific values for each actuator’s rendering parameters. If the robot’s current emotional state does not coincide with a key expression, the parameters are interpolated from nearby key expressions. This interpolation also allows for tweening values so that the robot may smoothly transition from one emotional state to another. The individual rendering parameters used to define the behavior for each of the Haptic Creature’s actuators will be described here in turn.
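For example, assuming the four key expressions nearest the current state sit at the corners of a rectangle in affect space, each rendering parameter could be blended bilinearly as in the following sketch; the actual placement of key expressions and the interpolation scheme used by the Renderer are not specified here.

    // Sketch of bilinear interpolation of one rendering parameter.
    public class KeyExpressionInterpolationSketch {

        // (v0, a0) and (v1, a1) are the corner coordinates of the surrounding
        // rectangle; pXY is the parameter value at corner (vX, aY); (v, a) is
        // the current emotional state.
        public static double interpolate(double v0, double v1, double a0, double a1,
                                         double p00, double p10, double p01, double p11,
                                         double v, double a) {
            double tv = (v - v0) / (v1 - v0);   // position along the valence axis, [0, 1]
            double ta = (a - a0) / (a1 - a0);   // position along the arousal axis, [0, 1]
            double low  = p00 + (p10 - p00) * tv;   // blend along valence at low arousal
            double high = p01 + (p11 - p01) * tv;   // blend along valence at high arousal
            return low + (high - low) * ta;         // blend across arousal
        }
    }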
Ears
The two ears can be controlled independently of each other in the single dimension of stiffness. Their firmness varies in a manner that is not visually perceptible but can be felt when the human squeezes them. Ear stiffness is specified by means of a volume parameter, which ranges from 0% (limp) to 100% (stiff).
Lungs
The Haptic Creature’s lungs modulate its manner of breathing through four parameters. Rate is defined as breaths-per-minute (bpm). Bias controls the symmetry of each breath by specifying the percentage that is dedicated to the inhalation phase, from 0% (all exhale) to 100% (all inhale) — for example, a bias of 25% would allocate 1/4 of each breath to the inhale and 3/4 to the exhale. Rest (milliseconds) allows for a pause at the end of inhalation and/or exhalation, and is defined independently for each phase. Volume defines the minimum and maximum position for each breath.
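One way these four parameters could combine into a position over time is sketched below; it assumes the two rests are subtracted from the breath period first and that the bias then splits the remaining motion time between inhale and exhale, which is an interpretation rather than the documented behavior of the Lung class.

    // Sketch of evaluating one breath cycle into a lung position over time.
    public class BreathCycleSketch {

        private final double periodMs;       // one full breath, derived from rate (bpm)
        private final double inhaleMs;       // moving from min to max volume
        private final double inhaleRestMs;   // pause at full inhalation
        private final double exhaleMs;       // moving from max back to min volume
        private final double minVolume;      // position at full exhale
        private final double maxVolume;      // position at full inhale

        public BreathCycleSketch(double rateBpm, double biasPercent,
                                 double inhaleRestMs, double exhaleRestMs,
                                 double minVolume, double maxVolume) {
            this.periodMs = 60000.0 / rateBpm;
            double motionMs = periodMs - inhaleRestMs - exhaleRestMs;
            this.inhaleMs = motionMs * (biasPercent / 100.0);
            this.exhaleMs = motionMs - inhaleMs;
            this.inhaleRestMs = inhaleRestMs;
            this.minVolume = minVolume;
            this.maxVolume = maxVolume;
        }

        // Lung position at tMs milliseconds into the current breath.
        public double positionAt(double tMs) {
            double t = tMs % periodMs;
            if (t < inhaleMs) {                               // inhaling
                return lerp(minVolume, maxVolume, t / inhaleMs);
            }
            t -= inhaleMs;
            if (t < inhaleRestMs) return maxVolume;           // rest at full inhale
            t -= inhaleRestMs;
            if (t < exhaleMs) {                               // exhaling
                return lerp(maxVolume, minVolume, t / exhaleMs);
            }
            return minVolume;                                 // rest at full exhale
        }

        private static double lerp(double from, double to, double fraction) {
            return from + (to - from) * fraction;
        }
    }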
Purr Box
The Haptic Creature’s purr box controls the presentation of a modulated vibrotactile purr. Waveform determines the type of wave generated: pulse, sawtooth, reverse sawtooth, sine, triangle, or null. On duration and off duration (milliseconds) define the wave’s duty cycle. Amplitude defines the wave’s minimum and maximum levels, each specified as a percentage from 0% to 100%.
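A sketch of how these parameters might be evaluated into an instantaneous amplitude follows; treating the on duration as one full wave period, and the specific wave equations, are assumptions of this sketch.

    // Sketch of evaluating the purr parameters at a given time.
    public class PurrWaveSketch {

        public enum Waveform { PULSE, SAWTOOTH, REVERSE_SAWTOOTH, SINE, TRIANGLE, NULL }

        // Amplitude in [0, 100] percent at tMs milliseconds into the cycle.
        public static double amplitudeAt(Waveform waveform, double onMs, double offMs,
                                         double minPercent, double maxPercent, double tMs) {
            double t = tMs % (onMs + offMs);
            if (waveform == Waveform.NULL || t >= onMs) {
                return 0.0;                        // off phase, or no purr at all
            }
            double phase = t / onMs;               // 0..1 across the on phase
            double shape;                          // 0..1 value of the chosen wave
            switch (waveform) {
                case PULSE:             shape = 1.0; break;
                case SAWTOOTH:          shape = phase; break;
                case REVERSE_SAWTOOTH:  shape = 1.0 - phase; break;
                case SINE:              shape = 0.5 * (1.0 - Math.cos(2.0 * Math.PI * phase)); break;
                case TRIANGLE:          shape = 1.0 - Math.abs(2.0 * phase - 1.0); break;
                default:                shape = 0.0;
            }
            return minPercent + (maxPercent - minPercent) * shape;
        }
    }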
Actuation
The Actuation component is tightly coupled with the Physical Renderer component and is charged with directly controlling the robot’s effectors. Specifically, this component interfaces with the various motors via the control hardware (see Communication and Control). It does little interpretation of the information, save adjusting normalized data appropriately for the individual hardware devices.
The Ear, Lung, and PurrBox classes (Physical Abstraction layer) comprise the current actuation infrastructure. Each of these classes encapsulates an appropriate actuator abstraction from the Transducer Bridge layer: Ear→Servo, Lung→Servo, PurrBox→Motor.
The Servo class controls the position of a servo motor via an angle property ([0.0, 180.0]). The Motor class controls the speed of a motor via a speed property ([0.0, 1.0]) and the direction of rotation via a rotation property (CW, CCW). The current implementation of the control hardware, however, does not allow the rotation to be specified, so this property is unused at present. In any case, the purr box is the only hardware currently controlled through the Motor class and has no need of the rotation property.
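As a final sketch, the mapping from the purr box’s abstract amplitude onto the Motor’s normalized speed might look as follows; the setter names and the PurrBoxSketch class are assumptions, and the rotation property is shown only to note that it currently goes unused.

    // Sketch of the PurrBox-to-Motor mapping implied above.
    public class PurrBoxSketch {

        public enum Rotation { CW, CCW }

        // Stand-in for the Transducer Bridge Motor class.
        public interface MotorBridge {
            void setSpeed(double speed);          // [0.0, 1.0]
            void setRotation(Rotation rotation);  // currently ignored by the hardware
        }

        private final MotorBridge motor;

        public PurrBoxSketch(MotorBridge motor) {
            this.motor = motor;
        }

        // amplitudePercent in [0, 100], e.g. from the purr waveform above.
        public void setAmplitude(double amplitudePercent) {
            double clamped = Math.max(0.0, Math.min(100.0, amplitudePercent));
            motor.setSpeed(clamped / 100.0);
        }
    }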