Product Review Based on Facial Expression Detection

Abstract—This paper presents a method for assessing the public acceptability of items based on their brand by analysing the facial expression of a consumer who intends to purchase the product at a supermarket. In such circumstances, facial expression detection is crucial to product evaluation. Emotions are conveyed through facial expressions, and sentiment analysis, a form of natural language processing, may be used for a variety of related purposes. As a result, several techniques for classifying human emotional states have been proposed. Facial expressions are identified by extracting feature points with a cascade classifier, which minimizes the time complexity. The owner can view the feedback on each reviewed product, and the resulting product ranking will assist the business owner in increasing product sales while also ensuring that the top products are available to customers.


I. INTRODUCTION
Human facial expressions have the power to communicate sentiments and emotions, which in turn determine how people react. A facial expression is an adaptive reaction that reveals a person's mental state while performing a task (e.g. the expression of a consumer while purchasing a product). Machines that analyze human facial expressions have been an active topic of study. Variations in expressions, the need for fast processing, and product-specific applications are the key issues in facial expression detection. Some products require more precise and faster algorithms, and balancing these two requirements is a key challenge [3].
In practice, analyzing a customer's facial expression and voice at that time may be quite useful. For instance, statistics on consumer curiosity might provide insight into a consumer's level of interest in a certain product [2].
Facial expression recognition is one of the most significant applications of image processing. The expressions on our faces disclose our emotions, and facial expressions are crucial in interpersonal communication.
It is a non-verbal gesture in which our emotions are reflected on our faces. Face recognition plays a major part in artificial intelligence and robotics, and is hence a generational requirement. The goal is to create an automatic facial emotion recognition system that can detect human facial images comprising various expressions and categorize them into seven expression classes: Neutral, Angry, Disgust, Fear, Happy, Sadness, and Surprise.

II. LITERATURE SURVEY
In this paper, a method is suggested for detecting the facial emotion of a customer for a given product. The facial expression is categorised as angry, disgust, fear, happy, sad, surprise, or neutral.
Vikrant Chaugule, Abhishek D, Aadheeshwar Vijayakumar, Pravin Ramteke, and Shashidhar Koolagudi: In this article, they suggest a technique to evaluate the public acceptability of items based on their brand by studying the facial expression of a client planning to purchase the product from a supermarket or hypermarket. Facial expressions are detected by extracting feature points with a modified Harris algorithm [1]. The suggested approach is compared with current techniques in terms of time complexity; the Harris method for feature point extraction has been updated to minimize computational complexity.
Preeti Thakre and Pankaj Agarkar: This paper investigates the impact of semantic categorization on NLP tasks. They examine why emotion and sentiment analysis might help models be more accurate [2]. In word vector space, emotional words have good emotional semantic discrimination, whereas feelings have greater discrimination in emotional semantic space.
They present a novel Python-based Emotion-Semantic Enhanced Convolutional Neural Network (ECNN) model that generates the emotional space using vectors corresponding to sentiments, for emotion recognition on Amazon product reviews. In comparison to other models, the ECNN model is better at capturing emotional semantics. Planned future capabilities include recognizing emotion from user-uploaded material.

III. METHODOLOGY
The proposed approach takes a real-time or recorded video as input and extracts frames (images) at regular intervals. Once a face has been retrieved, the bottom one-third of it is utilised to identify the mouth and lips, while the top one-third is used to detect the eyes. Using the suggested technique, feature points are extracted from both the lips and the eyes. The curious ratio, computed from the feature points of the eyes and mouth, is used to identify facial emotions. Curiosity, indifference, contentment, and a person's degree of enthusiasm may all be determined from facial expressions.
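The following is a minimal sketch, assuming OpenCV and its bundled Haar cascade, of the frame-sampling and region-splitting step described above; the sampling interval and camera index are illustrative placeholders, not values from the paper.

```python
import cv2

# Haar cascade face detector shipped with OpenCV
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)   # 0 = live camera; a file path works for recorded video
frame_interval = 30         # roughly one sampled frame per second at 30 fps (assumed)
frame_count = 0

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame_count += 1
    if frame_count % frame_interval:
        continue            # only analyse frames at the chosen interval
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = gray[y:y + h, x:x + w]
        eye_region = face[: h // 3, :]        # top one-third: eyes
        mouth_region = face[2 * h // 3:, :]   # bottom one-third: mouth and lips
        # feature points would be extracted from eye_region and mouth_region here

cap.release()
```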

2) Face Registration:
Face registration is a computer technique that recognises human faces in digital photographs and is used in a range of applications. Faces are first located in the picture using a set of landmark points, a step known as "face localization" or "face detection". The detected faces are then geometrically normalised to match a template image, in a procedure known as "face registration".
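As an illustration of the geometric normalisation step, the sketch below warps a face so that its two eye centres land on fixed template positions. The eye coordinates are assumed to come from a landmark detector; the function name, output size, and template positions are hypothetical.

```python
import cv2
import numpy as np

def register_face(face_img, left_eye, right_eye, size=96):
    """Warp face_img so the eye centres match a canonical template."""
    src = np.float32([left_eye, right_eye])
    # template eye positions for a size x size output (illustrative choice)
    dst = np.float32([[0.3 * size, 0.35 * size], [0.7 * size, 0.35 * size]])
    # rotation + uniform scale + translation estimated from the two point pairs
    M, _ = cv2.estimateAffinePartial2D(src, dst)
    return cv2.warpAffine(face_img, M, (size, size))
```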

3) Facial Feature Extraction:
The technique of finding certain areas, points, landmarks, or curves/contours in a given 2-D picture or 3-D range image is known as facial feature extraction. In this feature extraction stage, the registered picture is used to produce a numerical feature vector. Facial expression recognition is a human- or computer-based procedure that includes the following steps:
a) Detection of a face in a scene (e.g. in an image; this step is also referred to as face detection).
b) Extraction of facial characteristics from the detected face region (e.g. recognizing the shape of facial components or characterizing the texture of the skin in a facial region).
c) Analysis of the motion of facial features and of changes in their appearance, and categorization of this information into facial-expression interpretative categories such as facial muscle activation (smile or frown), emotion (affect) categories like happiness or anger, attitude categories like (dis)liking or ambivalence, and so on.
In the categorization phase, the algorithm attempts to classify the given face as depicting one of the seven fundamental emotions. According to numerous research studies, four essential procedures must be completed in order to undertake this work. A small sketch of step b), turning feature points into a numerical feature vector, follows this list.
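The sketch below assumes the feature points (eye centres and mouth corners/edges) are already available as (x, y) tuples; the particular ratios computed here are illustrative examples of a numerical feature vector, not the exact features used in the paper.

```python
import numpy as np

def feature_vector(left_eye, right_eye, mouth_left, mouth_right, mouth_top, mouth_bottom):
    left_eye, right_eye = np.array(left_eye), np.array(right_eye)
    mouth_left, mouth_right = np.array(mouth_left), np.array(mouth_right)
    mouth_top, mouth_bottom = np.array(mouth_top), np.array(mouth_bottom)

    eye_dist = np.linalg.norm(right_eye - left_eye)       # used to normalise for scale
    mouth_width = np.linalg.norm(mouth_right - mouth_left)
    mouth_open = np.linalg.norm(mouth_bottom - mouth_top)

    # scale-invariant ratios: how wide/open the mouth is relative to the eye distance
    return np.array([mouth_width / eye_dist, mouth_open / eye_dist])
```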

A. DeepFace
In this paper, we detect the facial expression of an existing image using the OpenCV, DeepFace, and matplotlib modules in Python. DeepFace is an extremely lightweight Python module for face recognition and facial attribute analysis. The open-source DeepFace library comprises cutting-edge face recognition models and conducts all facial recognition operations in the background. When you use DeepFace you have access to the following features:

1) Face Verification:
Face verification is the process of comparing one face to another to see if they are the same. As a result, face verification is frequently used to match a candidate's face to that of another person, for example to verify whether a person's physical appearance matches that of an ID document.
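A brief sketch of face verification with DeepFace; the image paths are placeholders.

```python
from deepface import DeepFace

# compare two photographs and report whether they show the same person
result = DeepFace.verify(img1_path="customer_id_photo.jpg",
                         img2_path="customer_camera_photo.jpg")
print(result["verified"])   # True if the two faces are judged to match
```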

2) Face Recognition:
The process involves searching a picture database for a given face. Face recognition necessitates a large number of face verification runs.
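A sketch of face recognition with DeepFace; the query image and database folder paths are placeholders. Internally, find() runs many verification comparisons, as described above.

```python
from deepface import DeepFace

# search a folder of face images for entries matching the query face
matches = DeepFace.find(img_path="query_face.jpg", db_path="face_database/")
print(matches)   # result table(s) of the closest matches per detected face
```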

3) Facial Attribute Analysis:
Facial attribute analysis is the task of evaluating the visual features of face photographs. It is used to extract attributes including age, gender, mood (emotion), and race/ethnicity.
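A sketch of facial attribute analysis with DeepFace; the image path is a placeholder.

```python
from deepface import DeepFace

# estimate age, gender, race/ethnicity, and the dominant emotion for the face(s) in the image
analysis = DeepFace.analyze(img_path="customer_face.jpg",
                            actions=["age", "gender", "race", "emotion"])
print(analysis)   # includes fields such as the dominant emotion per detected face
```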

4) Real-Time Face Analysis:
With the real-time video stream from your camera, you may test face recognition and facial attribute analysis.
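A one-line sketch of real-time analysis with DeepFace; the database folder path is a placeholder.

```python
from deepface import DeepFace

# open the camera and apply face recognition plus facial attribute analysis to the live feed
DeepFace.stream(db_path="face_database/")
```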

B. System Implementation
Steps:
1) Collection of real-time data (48-by-48-pixel grayscale images of faces, each labeled with one of the 7 emotion classes: anger, disgust, fear, happiness, sadness, surprise, and neutral).
2) Pre-processing of the images.
3) Detection of a face in each image.
4) Production of grayscale photographs of the cropped face.
5) The pipeline ensures every image can be fed into the input layer as a (1, 48, 48) numpy array.
6) The numpy array is passed into the Convolution2D layer.
7) Feature maps are created through convolution.
8) Max pooling keeps only the maximum pixel value in each feature-map window.
9) The pixel values are subjected to forward and backward propagation during training.
10) The trained model predicts which emotion is the dominant emotion in a given frame.
A minimal sketch of a network of this kind follows this list.
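The following is a minimal Keras sketch of the kind of CNN the pipeline describes: 48x48 grayscale input, convolution to build feature maps, max pooling to keep the strongest responses, and a 7-way softmax over the emotion classes. The layer sizes are illustrative, not the exact architecture used in the paper.

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),               # one-channel (grayscale) face image
    layers.Conv2D(32, (3, 3), activation="relu"),  # convolution produces feature maps
    layers.MaxPooling2D((2, 2)),                   # keep the maximum value per window
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(7, activation="softmax"),         # anger, disgust, fear, happiness, sadness, surprise, neutral
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```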

IV. RESULT AND DISCUSSION
The dominant emotion is predicted by reshaping each image obtained from the live video source to the model's input size. Over a 10-second window, a timer array keeps track of the detected emotions and returns the most frequently shown/captured emotion.
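A hedged sketch of this 10-second capture, assuming the DeepFace-based emotion analysis described in Section III: sample the camera, record the dominant emotion of each analysed frame, and return the most frequently seen emotion as the customer's review. The function name and fallback value are hypothetical.

```python
import time
from collections import Counter

import cv2
from deepface import DeepFace

def capture_review(duration=10):
    cap = cv2.VideoCapture(0)
    seen = []
    start = time.time()
    while time.time() - start < duration:
        ok, frame = cap.read()
        if not ok:
            break
        try:
            result = DeepFace.analyze(frame, actions=["emotion"],
                                      enforce_detection=False)
            # newer DeepFace versions return a list of results, one per face
            first = result[0] if isinstance(result, list) else result
            seen.append(first["dominant_emotion"])
        except ValueError:
            continue              # no face found in this frame
    cap.release()
    # most frequently captured emotion over the window (fallback if nothing was seen)
    return Counter(seen).most_common(1)[0][0] if seen else "neutral"
```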
Step 1: The owner (admin) adds the commodities for sale to the shop for reviewing purposes.
Step 2: The owner's (admin's) products are now visible to the customer (user), with the option of reviewing them.
Step 3: Customers can now leave product reviews by using the CLICK ME button.
Step 4: Customer feedback "happy" is recorded in the system.
Step 5: Customer feedback "neutral" is recorded in the system.
Step 6: The owner can now see the reviewed product. This product ranking will assist the business owner in increasing product sales while also ensuring that the top products are available to customers.
The owner of several shops with a range of items may gather customer feedback by collecting real-time emotion analysis of customers, and the gathered data can be evaluated by the owner for better decision-making.

V. CONCLUSION AND FUTURE WORK PLAN
The facial expression recognition system described in this paper presents a robust face recognition model based on the mapping of behavioural and physiological biometric variables. The physiological properties of the human face that are relevant to various expressions such as pleasure, sorrow, fear, anger, surprise, and disgust are linked to geometrical structures that are reconstituted as the recognition system's matching template. Our work focuses on analyzing the live facial expressions of consumers who are viewing a certain product, allowing us to conduct a real-time assessment of that product and score it based on the results of the customer's facial expression analysis. This product rating will assist the business owner in increasing product sales while also ensuring that the top items are available to customers. This approach is more accurate and faster than previous techniques, which had a greater margin of error.
Further, this work might be expanded to consider elements of product evaluation beyond emotions, such as the amount of time a consumer spends looking at a product (which would necessitate face recognition in supermarkets) and product review in online shopping. Because many people nowadays are drawn to online shopping, such reviews may aid in the early detection of product failure, minimizing future production losses.