Early Explorations using KNN to Classify Emotions in Virtual Reality based on Heart Rate (HR) and Electrodermography (EDG)

To detect multimodal emotions in Virtual Reality (VR), this paper presents the findings of using a K-Nearest Neighbour (KNN) classifier on merged Heart Rate (HR) and Electrodermography (EDG) signals. Participants were shown 360-degree videos through a VR headset to elicit emotional reactions, while a wearable that measures skin activity and pulse rate in real time recorded their responses. Thirty participants took part in the experiment, and the KNN classifier was used for intra-subject classification. Using HR merged with EDG signals and KNN as the classifier, 11 of the 30 participants achieved a peak accuracy of 100%. These findings show that combining HR and EDG signals with a KNN classifier can produce highly accurate results. Possible applications of this study include VR rehabilitation, gaming, and entertainment.


Introduction
Human emotion is engaged constantly in daily life: making decisions, taking action, and talking with other people. Recently, there has been a surge in affective-computing studies that use affect detection to identify emotions. An individual's emotion can be classified from psychophysiological signals such as heart rate (HR) and electrodermography (EDG), also known as Galvanic Skin Response (GSR). This paper's main goal is to present a method for multiclass emotion classification using HR merged with EDG data and a KNN classifier, with Virtual Reality (VR) serving as the stimulus to evoke emotion in the experiment's participants. Researchers are increasingly using 360-degree videos in VR to elicit emotional responses from their participants [1,2]. The following subsections discuss the use of VR as the stimulus, HR signals, and EDG signals.

Virtual Reality (VR) as a Stimulus
A Virtual Reality (VR) headset allows users to experience a 360-degree view of their surroundings. This is especially true for 360-degree videos, which make it easy for users to become fully immersed in the scene. Throughout the study, the VR experience allows participants to express a wide range of feelings. Figure 1 shows participants wearing the VR headset while their emotions are evoked by the 360-degree videos shown. The use of VR in emotion-related research has grown, particularly in the treatment of people with mental illness: VR is used to treat several mental health conditions, including post-traumatic stress disorder, fear of flying, and claustrophobia [2]. Electrodermography (EDG) merged with heart rate (HR) are the primary signals collected from the participants in this study, with VR serving as the stimulus. During the experiment, while wearing the VR headset and watching 360-degree films, each participant wore a wearable on the hand that collected the two signals. The next subsection briefly discusses the HR signal, the first of the two signals collected.

HR Signal
Heart Rate (HR) signals capture the participants' heart rates in real time, tracking cardiac activity as they respond to the 360-degree video stimuli. In emotion studies, HR signals are commonly employed to identify emotions. Photoplethysmography (PPG) is used to monitor the participant's blood volume pulse (BVP), which is then converted into HR signals [3,4,5].
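As an illustration of this BVP-to-HR conversion (not taken from the paper's implementation), the sketch below estimates heart rate from a BVP waveform by detecting systolic peaks and averaging the inter-beat intervals. The sampling rate, the peak-height threshold, and the synthetic signal are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import find_peaks

def bvp_to_hr(bvp, fs):
    """Estimate heart rate (bpm) from a BVP waveform by detecting
    systolic peaks and averaging the inter-beat intervals."""
    # Peaks must be at least 0.4 s apart (caps HR near 150 bpm) and
    # above half the normalized pulse amplitude to suppress noise.
    peaks, _ = find_peaks(bvp, distance=int(0.4 * fs), height=0.5)
    ibi = np.diff(peaks) / fs      # inter-beat intervals in seconds
    return 60.0 / ibi.mean()       # mean interval -> beats per minute

# Synthetic 64 Hz BVP trace: a 1.2 Hz pulse (~72 bpm) plus mild noise.
rng = np.random.default_rng(0)
fs = 64
t = np.arange(0, 30, 1 / fs)
bvp = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)
print(f"estimated HR: {bvp_to_hr(bvp, fs):.1f} bpm")  # close to 72 bpm
```

In practice a wearable performs this step internally, so the researcher typically receives the HR stream directly rather than raw BVP.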

EDG Signal
Electrodermography (EDG) monitors the skin conductivity and electrical activity of the individual, measured in microsiemens (µS). EDG signals are well suited to determining a participant's emotion because they reflect activity of the sympathetic nervous system, a branch of the autonomic nervous system. The EDG signals are identified from the participant's sweat-gland activity, which is driven by the sympathetic nervous system [6].
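To make the use of EDG concrete, the following sketch computes simple per-window summary statistics (mean level, variability, and mean absolute first difference as a rough phasic proxy) from a skin-conductance trace in microsiemens. The window length and the synthetic trace are illustrative assumptions, not the study's actual feature set.

```python
import numpy as np

def edg_window_features(edg, fs, win_s=5.0):
    """Summarize a skin-conductance trace (microsiemens) per window:
    mean level, variability, and mean absolute first difference
    (a rough proxy for phasic sympathetic responses)."""
    win = int(win_s * fs)
    feats = []
    for start in range(0, len(edg) - win + 1, win):
        seg = edg[start:start + win]
        feats.append([seg.mean(), seg.std(), np.abs(np.diff(seg)).mean()])
    return np.array(feats)

# Synthetic 4 Hz trace: slow tonic drift plus one phasic burst at t = 30 s.
fs = 4
t = np.arange(0, 60, 1 / fs)
edg = 2.0 + 0.01 * t + 0.3 * np.exp(-((t - 30.0) ** 2) / 4.0)
features = edg_window_features(edg, fs)
print(features.shape)  # 12 five-second windows, 3 features each
```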
Section 2 discusses the methodology and the equipment used for signal acquisition, including the methods for obtaining the EDG and HR signals. Section 3 presents the results of intra-subject classification for thirty participants, using K-Nearest Neighbour (KNN) for four-class emotion classification with HR merged with EDG as the data. Section 4 concludes the paper.

Four-Quadrant Emotion Model
There are two basic types of emotions: positive and negative. Positive feelings can improve a person's life, whereas negative emotions can deteriorate a person's mental health and, if left untreated, may result in depression [7]. Valence and arousal are added in the four-class emotion model, which enriches the emotion attributes compared with a two-class model that primarily distinguishes negative emotions from positive ones [7]. Valence signifies how negative or positive the emotion is, while arousal signifies the intensity of the emotion, ranging from low to high.
Russell's circumplex of emotion is a widely used model for classifying emotions [7], in which arousal and valence work together to produce a bi-dimensional perspective. Figure 2 shows this two-dimensional model based on valence and arousal. Russell's valence-arousal scale is frequently used in the field of emotion studies [7]; one reason the model is so popular is its success in classifying emotions as positive or negative in a two-dimensional space. The figure depicts a four-quadrant model with four classes, each representing an emotion ranging from low to high arousal and from positive to negative valence. Happy emotion is characterized by positive valence and high arousal; furious emotion by negative valence and high arousal; bored emotion by negative valence and low arousal; and calm emotion by positive valence and low arousal. The next subsection describes the experimental setup phases.
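The four-quadrant mapping described above can be sketched as a small function; the numeric valence/arousal scale centred at zero is an assumption for illustration.

```python
def quadrant(valence, arousal):
    """Map a (valence, arousal) pair on Russell's circumplex to one of
    the study's four emotion classes. Axes are centred at zero:
    positive valence is pleasant, high arousal is more activated."""
    if valence >= 0 and arousal >= 0:
        return "happy"    # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "furious"  # negative valence, high arousal
    if valence < 0:
        return "bored"    # negative valence, low arousal
    return "calm"         # positive valence, low arousal

print(quadrant(0.7, 0.6))    # happy
print(quadrant(-0.4, -0.8))  # bored
```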

VR as Stimulus
Virtual reality (VR) was chosen for its ability to evoke the participants' emotions while letting them observe their surroundings in 360 degrees through the videos shown. The model depicted in Figure 2 serves as the foundation for the stimuli in the 360-degree videos used to elicit the participants' emotions. There are four emotion quadrants in total: the first quadrant features clips of calm, relaxed, and happy emotions; the second displays videos of joyful and ecstatic feelings; the third shows clips depicting fear, rage, and distress; and the fourth and final quadrant displays videos with depressing and gloomy sentiments. To reset the participants' emotional baseline, a 10-second blank screen is shown after each quadrant. Next, the wearable used to collect the merged HR and EDG signals is discussed.

Wearable and Feature Extraction
The Empatica E4 wearable is the main tool used to collect the participant's Electrodermography (EDG) and Heart Rate (HR) activity. While the participant used the Virtual Reality (VR) headset, the HR and EDG signals were captured through an application, and the recorded signals could be retrieved from the Empatica cloud-based server on the Empatica website once the experiment was finished. The participant's raw EDG data and heart rate were the two main components employed in this experiment. The experiment's complete run time was 5 minutes and 24 seconds. The HR and EDG data were merged in an Excel file; because the two streams are recorded at different rates, the HR data is synchronized to the EDG data at a four-fold ratio.
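A minimal sketch of this synchronization, assuming the wearable reports HR at 1 Hz and EDG at 4 Hz (so each HR sample is repeated four times to align the streams), might look like this; the sample values are illustrative.

```python
import numpy as np

# Assumed rates: HR at 1 Hz and EDG at 4 Hz, so each HR sample is
# repeated four times to align the two streams sample-for-sample.
hr = np.array([72.0, 73.0, 71.5])        # three 1 Hz heart-rate samples
edg = np.linspace(2.0, 2.2, 12)          # twelve 4 Hz EDG samples (µS)

hr_upsampled = np.repeat(hr, 4)          # 3 samples -> 12 samples
merged = np.column_stack([hr_upsampled, edg])  # one row per 0.25 s
print(merged.shape)  # (12, 2)
```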

Experiment Phase
Figure 3 depicts the stages of the experiment, from data collection through data pre-processing and classification to result generation. During the data collection phase, the wearable monitored the participants' heart rate (HR) and electrodermography (EDG) as they watched 360-degree clips through a virtual reality (VR) headset. In the data pre-processing step, the EDG and HR data are merged in a spreadsheet and the signals are synchronized to each quadrant based on the video's start time. Classification was then performed with SVM, Random Forest, and the K-Nearest Neighbour (KNN) classifier for intra-subject classification on the merged EDG and HR data, classifying emotion into four classes and producing results based on accuracy. KNN performed best, generating the highest accuracy for intra-subject classification; this paper presents the results retrieved from that experiment. The findings and analysis of the experiment are discussed in the section that follows.
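The classification step can be illustrated with a short scikit-learn sketch. The synthetic per-participant data, the neighbour count k = 5, and the train/test split are all assumptions for demonstration, since the paper does not specify these settings.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for one participant's merged (HR, EDG) samples,
# labelled 0-3 by emotion quadrant. Class separation, k = 5, and the
# 75/25 split are illustrative assumptions only.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=[70.0 + 5.0 * q, 2.0 + 0.3 * q], scale=0.4, size=(80, 2))
    for q in range(4)
])
y = np.repeat(np.arange(4), 80)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Scale features before KNN so HR (bpm) does not dominate EDG (µS).
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
clf.fit(X_tr, y_tr)
print(f"intra-subject accuracy: {clf.score(X_te, y_te):.3f}")
```

Scaling matters here because KNN is distance-based: without it, the heart-rate axis (tens of bpm) would swamp the skin-conductance axis (a few µS).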

Results for the KNN Classifier Using HR Merged with EDG for Emotion Classification
Thirty participants volunteered to take part in the study, wearing a wearable that measured their skin activity and heart rate (HR) while viewing 360-degree films through a virtual reality (VR) headset. The experiment's goal was to capture the participants' emotions as they watched the videos. The intra-subject classification results for the 30 participants are shown in the following graphs. Figure 4 shows the intra-subject classification accuracy using HR merged with EDG data and a KNN classifier for the 30 participants. The highest accuracy achieved was 100%, reached by 11 participants; the majority exceeded 90% accuracy, with the lowest accuracy of 84.6% achieved by participant 27. Finally, the conclusion of the paper is discussed.

Conclusion
These findings demonstrate that a workable strategy with VR stimuli can provide excellent four-class emotion classification accuracy using HR merged with EDG data and a KNN classifier. Future researchers interested in investigating emotion in their fields may be inspired by this finding to consider HR merged with EDG signals as the core signals for their experiments, as a substitute for electrocardiogram (ECG) and electroencephalogram (EEG) signals.
Future work will focus on testing additional machine learning and deep learning classifiers to compare their accuracy against the current results, and on attempting to increase classification accuracy for both intra- and inter-subject classification.

Fig. 1. Participants wearing the VR headset to evoke their emotions.

Fig. 2. Russell's two-dimensional model of emotion based on valence and arousal.

Fig. 4. KNN intra-subject classification accuracy using HR merged with EDG: 11 of 30 participants achieved 100% accuracy, 29 of 30 exceeded 90%, and the lowest accuracy achieved was 84.6%.