Preliminary Emotion-Based Model for Realistic 3D Animation



Introduction
Starting in 2018, the Malaysia Digital Economy Corporation (MDEC) highlighted the dominance of 3D animation over 2D animation, indicating a surge in industry demand [1]. Despite technological advancements, animators make up only 19% of animation companies' workforce, creating a strain on resources. The widespread application of 3D technology across industries [2], coupled with a burgeoning global 3D animation market valued at USD 11.46 billion in 2016, underscores the sector's rapid expansion [3]. Animation stories must portray a range of emotions that reflect human experience, yet the existing literature lacks comprehensive character-building guidelines. Consequently, our study examines character movements corresponding to different emotions, ultimately formulating guidelines for realistic character animation in response to the growing demands of the dynamic animation industry.
Previous studies assert that six universal emotions (enjoyment, anger, fear, sadness, disgust, and surprise) affect every individual globally [4]. Most animation stories incorporate these fundamental feelings, emphasizing the need for approximate realism in 3D animation. A slight error in character portrayal can disrupt a scene, leaving the audience disconnected. While many 3D animated stories offer exemplary models, novice animators often overlook crucial details, particularly character movements during emotional scenes. Furthermore, another study underscores that even minor animation discrepancies can render a scene implausible, given animation's significant role in the virtual reality domain [5].
This study aims to aid prospective animators in enhancing animation narratives, benefiting their respective countries' animation sectors. Growing interest in 3D animated stories, observed in popular productions such as Frozen and Inside Out, underscores current market trends. The proposed model functions as a valuable tool for future animators, facilitating optimization of character movement and establishing a compelling connection between animated characters and audience emotions.
This paper introduces a preliminary emotion-based model for animated characters, grounded in six basic emotions. Employing an empirical approach, we investigate the impact of emotions on human perception and apply the findings to 3D animation. Our experiments expose participants to emotional scenes to assess audience engagement. Subsequently, we analyze the movement associated with specific emotions and correlate our observations with existing literature on human responses to emotions. The proposed model not only serves as a guide for future animators but also lays potential groundwork for developing specialized machine intelligence. Such an application of artificial intelligence could aid in producing more realistic and expressive animations, enhancing the overall quality of animated storytelling.
This paper is organized as follows. Section 2 reviews literature related to character-building principles and basic emotions, culminating in a conceptual framework. Section 3 details our methodology, encompassing experiment design, data collection, and analysis. In Section 4, we present results, discuss findings, and introduce our preliminary model. Section 5 concludes our study.

Background
In this section, we first discuss the principles of animation, followed by the categories of character animation, basic emotions, and lastly the conceptual model of this study.

Principles of Animation
This subsection introduces the principles of animation based on Shapiro's Building a Character Animation System [6] and Lasseter's Principles of Traditional Animation Applied to 3D Computer Animation [7].
Building a Character Animation System. In his work [6], Shapiro articulated that animating a virtual character for simulations and games involves numerous significant parts of character modelling, incorporating locomotion, facial animation, speech synthesis, reaching/grabbing, and various automated non-verbal behaviours such as nodding, gesturing, and eye saccades. This study takes Shapiro's work as its main guidance due to its comprehensiveness in covering a wide range of animation controllers in character animation. The presented work covers a total of 15 controllers for manipulating the elements of character movement from top to toe, as summarized in Table 1.
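For concreteness, the 15 controllers used throughout this study can be captured as a small data type. The sketch below is illustrative only (the Python identifiers are our own); the controller labels are those listed in Table 1 and referenced in Section 4.

```python
from enum import Enum

class Controller(Enum):
    """Shapiro's 15 animation controllers, as referenced in this study."""
    WORLD_OFFSET = "world offset"          # global position and direction
    IDLE_MOTION = "idle motion"            # subtle ambient movement (hair, clothing)
    LOCOMOTION = "locomotion"              # walking, running
    ANIMATION = "animation"                # whole-body actions such as jumping
    REACH = "reach"                        # arm gestures toward objects
    GRAB = "grab"                          # hand control for touching/picking up
    GAZE = "gaze"
    BLINK = "blink"
    BREATHING = "breathing"
    EYE_SACCADES = "eye saccades"
    HEAD = "head"
    FACE = "face"
    OVERRIDE = "override"
    CONSTRAINT = "constraint"
    GENERAL_PARAMETERS = "general parameters"  # non-skeleton data

# Sanity check: the model operates over exactly 15 controllers.
assert len(Controller) == 15
```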
This research, focusing on character building, necessitates a review of other work pertaining to realistic animation characters. Another study [8] contends that, to address the challenge of producing realistic human-character movement, animation software should not only facilitate plausible motion but also give the animator full control of the process. Lasseter's work laid the foundations for realistic animation, emphasizing traditional methods applied to 3D animation [7], primarily involving objects and non-human characters. Given this study's focus on human character development, those principles are not directly applicable but remain related. Covering aspects such as squash and stretch, timing, and staging, the eleven principles only scratch the surface of character development. Timing, for instance, depends on an object's weight and size, limiting its usefulness for this study. In addition, an industrial source [9] underscores the importance of proper timing, as inappropriate use can lead to unpredictable shots and poorly planned events, diminishing emotional impact. In this study, emotion serves as the timing element, determining how characters react based on emotional cues. Furthermore, a prior study [10] categorizes character animation into four areas: character design, character acting, characterization and plot, and character affordance. Each category has a distinct meaning and focus.
Character design involves the motivation, form, and psychology behind a character's design, impacting audience emotion. Character acting pertains to designing movement, expression, gesture, and voice. Character affordance deals with rigging, including sound and capabilities. Lastly, characterization and plot concentrate on designing inter-character relationships within the story-world.

Human Basic Emotions
This study focuses on basic emotions, which are integral to every animation story: each narrative conveys at least one emotion. A previous work on generative agents [11] likewise discussed the role of emotions. Over the past century, researchers have debated the concept of basic emotions. In 1872, Charles Darwin published 'The Expression of the Emotions in Man and Animals,' identifying over 30 emotions grouped into seven categories. Subsequent studies have proposed various classifications, yet no consensus has emerged regarding the precise number or nature of fundamental emotions [12].
Early research identified six fundamental emotions universally experienced by humans: joy, sadness, disgust, anger, fear, and surprise [13]. These emotions can combine, yielding more intricate or mixed feelings such as love. Joy is typically conveyed through a smiling face, relaxed body language, and an upbeat tone of voice. Associated with marital satisfaction and longevity, joy contributes to both physical and mental well-being. In contrast, sadness is linked to adverse health outcomes, including stress, loneliness, anxiety, and depression. Expressed through grief, hopelessness, and dampened mood, sadness may manifest as quietness or crying, accompanied by low energy and negative thoughts. Fear, crucial for survival, triggers a physiological response, including tense muscles, increased heart rate, and heightened alertness. Widening eyes and pulling back the chin signify fear, prompting actions such as hiding or fleeing. Disgust is displayed through physical reactions like turning away, vomiting, or wrinkling the nose, often triggered by unpleasant tastes, sights, or smells. Anger, a powerful emotion stemming from frustration, is expressed through facial features, body language such as turning away, and a gruff tone of voice. Aggressive behaviors and physiological responses like sweating accompany anger. Surprise, akin to fear, activates a fight-or-flight response, marked by widened eyes, raised brows, an open mouth, and physical reactions like jumping back. Verbal responses include yelling, gasping, or screaming. Understanding these universal emotions and their expressions is pivotal for comprehending human responses in various situations.
The selection of the six basic emotions is grounded in consensus among researchers, although some have since challenged this agreement. One study exemplifies this adherence to the six basic emotions in its work on the autonomic nervous system [14], affirming their relevance. Consequently, the six basic emotions prove instrumental for studies in this domain. In summary, there is a recognized need to formulate a guideline for character animation rooted in basic emotions. Such a guideline will aid future animators in discerning the crucial elements to emphasize, minimizing errors in effectively conveying emotions to the audience.
Based on the literature review, we have distilled that crafting a realistic character animation involves addressing two essential factors: character movement and facial expression, as depicted in Figure 1. This study aligns the animated movement elements from Shapiro's controllers with six basic emotions to animate expressions. Our findings indicate that distinct emotions entail unique movements, extending beyond facial expressions.

Methodology
This section outlines our study's methodology, covering research design, experiment setup, and data analysis. Our focus is on exploring the impact of controllers on expressing basic emotions in character animation, drawing on methods from another relevant work [15]. An experimental design was chosen for its effectiveness in assessing the viability and impact of programs.
We employed Google Forms to distribute questionnaires for participant recruitment, pilot testing, and both experiments, streamlining response gathering. The questionnaires aimed to gauge participants' understanding of animation controllers and the controllers' direct influence on expressing basic emotions.
Our initial pilot experiment involved 40 participants, but logistical challenges and identified questionnaire limitations led to revisions. Following these refinements, we conducted the real experiment with 30 participants who watched the animated movie 'Inside Out.' Participants, possessing a moderate understanding of animation controllers and basic emotions, received a briefing before exposure to the animation. Subsequently, a second experiment was conducted to validate the results obtained from the preliminary model.
We analyzed participant agreement percentages, computing the average agreement for each emotion-controller pair. This data was cross-referenced with existing literature on emotion and human response, forming our model and ensuring a comprehensive interpretation of the results.

Results and Discussions
In this section, we present and discuss the results and analysis of the first experiment and the second experiment, and finally the formulated preliminary guiding model.

Results for First Experiment
The first experiment required participants to select a scale from one to five, indicating their agreement with the presented animation controllers and their association with each emotion. Table 2 displays the average results of the experiment.
Since every controller for each emotion has a maximum value of 5, the differences can be discerned by calculating the average. The minimum chosen average for a controller is 4, serving as the threshold that determines its association with the emotion. This decision aligns with the Likert-scale values: 1 = Strongly Disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, and 5 = Strongly Agree. In practical terms, if the mean falls below 4, the controller is excluded from consideration for the associated emotion. The subsequent figures break this table down for each emotion.
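The threshold rule is simple to express in code. The sketch below is a minimal illustration, assuming per-controller mean agreement scores have already been computed from the questionnaire responses; the sample values are hypothetical, not taken from Table 2.

```python
def associated_controllers(means, threshold=4.0):
    """Keep only controllers whose mean Likert agreement reaches the
    'Agree' level (4 on the 1-5 scale used in this study)."""
    return {ctrl for ctrl, mean in means.items() if mean >= threshold}

# Hypothetical per-controller averages for one emotion:
sample_means = {"gaze": 4.6, "face": 4.8, "blink": 3.2, "locomotion": 4.1}

# 'blink' falls below the threshold of 4 and is excluded.
print(associated_controllers(sample_means))
```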
Figure 2 illustrates joy's eleven controllers: world offset, idle motion, locomotion, animation, reach, grab, gaze, eye saccades, head, face, and override. World offset signifies global direction and position, reflecting excitement through continuous movement. Idle motion captures subtle hair and clothing shifts during movement. Locomotion portrays cheerful walking, while animation encompasses body movements such as jumping. Reaching involves arm gestures toward objects, and grabbing pertains to hand control for touching or picking up items. Gaze, eye saccades, head, and face govern eye and body movements. Joy's dynamic nature necessitates multiple controllers, excluding blinking, constraint, breathing, and general parameters, as physical movement takes precedence.
In Figure 3, the average controllers for sadness include world offset, idle motion, grab, gaze, blink, head, and face. Excluding locomotion, animation, and reach is deliberate, aligning with the slowed or still movements characteristic of grieving individuals. Notably, idle motion, such as wind-blown hair, amplifies the character's despondency, immersing the audience in their emotional world. Focus on the head, face, blink, and gaze becomes crucial in sadness, given the minimal physical activity. Omitting override and constraint emphasizes the significance of upper-body expressions, and grab, symbolizing shielding the face, aptly contributes to conveying sadness.
In Figure 4, fear is represented by the following controllers: world offset, idle motion, locomotion, reach, grab, gaze, breathing, eye saccades, head, face, and override. Fear prompts fleeing, justifying the inclusion of world offset, idle motion, locomotion, and override. Reach and grab signify protective actions against perceived threats. Breathing, eye saccades, head, and face manifest the wide-eyed expressions seen during escape. Blinking is excluded, as wide-eyed fear minimizes eye closures. General parameters, reflecting non-skeleton data, remain unchanged during fear, aligning with the focus on heightened skeletal expressions that capture the essence of the emotion.
Figure 5 reveals anger's controllers: locomotion, reach, grab, gaze, breathing, eye saccades, head, face, and general parameters. Anger arises when characters lose patience with a situation, leading to actions such as searching for objects on which to vent frustration. Locomotion, reach, and grab exemplify this reactive behavior. Anger is manifested through erratic breathing, an intense gaze, and a reddened facial complexion, necessitating general parameters. Animation and override are excluded, as anger prompts movement away from a fixed spot, making these controllers irrelevant to the expression of this emotion.
Figure 6 depicts the average controllers for surprise: gaze, eye saccades, and face. Surprise primarily manifests through facial changes: a wide-eyed expression with slight forehead wrinkles. Unlike fear, which prompts flight, surprise captures the initial shock. Consequently, locomotion is omitted, while animation is relevant for slight body movement. Other controllers are unnecessary, as a moment of surprise lasts a brief second, and participants may not have consciously perceived additional changes. The emphasis on facial expressions aligns with the instantaneous nature of the surprise emotion.
Based on Figure 7, the controllers for disgust include idle motion, locomotion, gaze, and face. Disgust, stemming from unpleasant or offensive stimuli, prompts actions such as running away (locomotion). Gaze and facial expressions dominate, contributing to idle motion. The limited scores for animation and head movement may stem from participant confusion regarding the timing of the emotion's occurrence. Overall, disgust is conveyed predominantly through subtle body language and facial reactions, with participants possibly facing challenges in recognizing nuanced changes during the emotion.

Results for Second Experiment
Table 3 presents the average outcomes from the second experiment, which involved 15 participants. Deviating from prior suggestions of four to five testers, the larger participant pool enhances the study's robustness [16]. Drawn from the initial experiment, participants exhibited familiarity with the controllers and emotions. The data, assessed on a scale from one (strong disagreement) to five (strong agreement), consistently exceeds four for all emotions. This collective agreement substantiates the study's findings, affirming the pivotal role of body language in character emotions. Notably, the face and gaze controllers consistently surpass 4.0, emphasizing their universal significance in emotional expression, while other controllers vary by specific emotion.

Preliminary Basic Emotion-Based Model for Realistic 3D Animation
Our findings suggest a direct correlation between the level of realism in animation and the efficacy of emotion controllers, relying notably on the face and gaze controllers for distinct facial movements. Building on these findings, we have formulated a preliminary model, illustrated in Figure 8. Our model categorizes animation controllers along the axis of basic emotions, providing future animators with a guide to enhancing expression. For instance, to convey joy effectively, animators should prioritize controllers such as world offset, idle motion, locomotion, animation, reach, grab, gaze, eye saccades, head, face, and potential overrides. This approach caters to the prolonged nature of happiness, which influences other characters. In contrast, expressions of surprise are typically brief. Our model suggests animators can focus on the character's gaze, eye saccades, and facial expression, conveying the emotion effectively without extensive additional elements and allowing the camera to concentrate on the character's face.
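The model's emotion-to-controller assignments can be expressed as a simple lookup table. The sketch below encodes the controller sets reported in Section 4 for each of the six emotions; the function name and data structure are our own illustration, not part of the model itself.

```python
# Controller sets per emotion, as reported in Figures 2-7 of this study.
EMOTION_CONTROLLERS = {
    "joy": {"world offset", "idle motion", "locomotion", "animation", "reach",
            "grab", "gaze", "eye saccades", "head", "face", "override"},
    "sadness": {"world offset", "idle motion", "grab", "gaze", "blink",
                "head", "face"},
    "fear": {"world offset", "idle motion", "locomotion", "reach", "grab",
             "gaze", "breathing", "eye saccades", "head", "face", "override"},
    "anger": {"locomotion", "reach", "grab", "gaze", "breathing",
              "eye saccades", "head", "face", "general parameters"},
    "surprise": {"gaze", "eye saccades", "face"},
    "disgust": {"idle motion", "locomotion", "gaze", "face"},
}

def controllers_for(emotion):
    """Return the set of controllers the model recommends for an emotion."""
    return EMOTION_CONTROLLERS[emotion.lower()]

# Per Section 4, face and gaze appear in every emotion's controller set.
assert all({"face", "gaze"} <= ctrls for ctrls in EMOTION_CONTROLLERS.values())
```

A lookup like this also hints at how the model could seed automated tooling: given an emotion tag on a scene, a pipeline could pre-activate the recommended controllers for the animator to refine.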
Moreover, with advancements in machine intelligence, animators could leverage automated assistance in applying these controllers efficiently. The potential integration of AI algorithms could streamline the animation process, allowing animators to focus on creative aspects. This pattern aligns with human responses to each emotion, as detailed in Sections 2 and 4. Our preliminary model serves as an initial step for future animators, and with the possibilities of machine intelligence, it could further enhance efficiency and creativity in creating realistic character animations. Animators might benefit from automated assistance, reducing the need for extensive deliberation over which elements to prioritize.

Conclusions & Future Work
In this paper, we present a preliminary model guiding animators in animating characters based on six basic emotions, utilizing Shapiro's 15 controllers [6]. Our findings suggest that this model offers time-saving convenience for animators, aiding in designing and focusing on enhancement elements during the animation process. It provides valuable supplementary knowledge for animators venturing into realistic animation development and creating expressive characters. However, our study solely delves into enhancing controllers for animated character movement, omitting software implementation details. Our scope concentrates on character building, excluding investigation of other production aspects such as background music or environmental ambiance. To enhance our work, we plan to provide scenario examples and technical tutorials, aiding animators in creating realistic animated characters. While face controllers alone can convey emotion, integrating other controllers enhances audience immersion. Future experiments can explore character movement with the controllers for each emotion. Additionally, we suggest investigating other potential enhancing elements, such as complex emotions, movements, and gestures. Considering the evolving landscape, the integration of machine intelligence could further streamline animation processes by automating certain aspects, potentially offering animators valuable assistance and creative possibilities.
We would like to thank Universiti Malaysia Sabah for the general support of this work.

Fig. 8: Preliminary model for character building based on basic emotions.

Table 2: Average of the agreement for each controller by each emotion.

Table 3: Scale results and average of the agreement with controllers during the second test.