Search results
1–10 of over 4,000
Jihye Park and Arim Kim
Abstract
Purpose
This study aims to examine the following issues: whether consumers use a dog’s facial expressions and gaze on a product’s packaging to interpret the emotions of a dog and evaluate product quality and how owner identification with the dog moderates the effect of a dog’s facial expressions on product evaluations.
Design/methodology/approach
A field study and three lab experiments were conducted to examine the moderating roles of a dog’s gaze on the product package (Study 1) and owner–dog identification (Study 2) in the effect of facial expressions of a dog on product evaluations.
Findings
Results showed that the facial expressions of a dog presented on the product package influenced the perceived mood of the dog and the evaluation of product quality. These effects were strengthened when the dog gazed directly forward. Furthermore, owners who identified more strongly with their dog were more responsive to a dog with a smiling face and evaluated product quality more positively than those who identified less strongly.
Practical implications
Marketing practitioners in the pet industry can use the findings of this study to select and place an appropriate pet image on the product package. Happy facial expressions and the direct gaze of a pet can influence positive evaluations of a product and, as a result, increase the purchase intention. Product managers also can place words, phrases or images on the product package that highlight a dog as an inseparable part of the owner’s everyday life and as a representation of his/her identity. Emphasizing the owner’s dog as an extension of him/herself or a part of his/her identities can encourage the active processing of a dog’s facial expressions on the product package and the positive evaluation of a product.
Originality/value
The present work adds valuable empirical findings to the limited marketing literature for the pet-related industry. The results of the experiments showed how consumers process the facial expressions and gaze of a dog and use them to infer the quality of a product. Furthermore, the findings extend prior literature reporting that dog owners with a greater identification are more likely to humanize their pet dogs and develop empathetic abilities.
Ruijuan Wu, Xiaoqian Ou and Yan Li
Abstract
Purpose
The objective of this study is to examine the effect of human model facial presentation (a smiling facial expression vs a neutral facial expression vs no facial presentation) on consumers' approach behavior and to determine the mechanism and boundary conditions behind such effects.
Design/methodology/approach
The research consisted of four laboratory experiments.
Findings
The results of four studies showed that a smiling facial expression led to the highest score for approach behavior. Pleasure and arousal mediated the effect of facial presentation on approach behavior. In the relationship between facial presentation and approach behavior, the moderating effects of emotional receptivity and the situation were significant. To be specific, for participants with high emotional receptivity, smiling facial expressions led to the highest approach behavior; for participants with low emotional receptivity, neutral expressions led to the highest approach behavior. In a browsing situation, the approach behavior of participants in response to a smiling facial expression was the highest. However, no significant differences were found in approach behavior under the three conditions regarding a purchasing situation.
Originality/value
This study supplements the literature on human model presentation and enriches the study of facial expressions.
Matteo Sorci, Thomas Robin, Javier Cruz, Michel Bierlaire, J.-P. Thiran and Gianluca Antonini
Abstract
Facial expression recognition by human observers is affected by subjective components; indeed, there is no ground truth. We have developed Discrete Choice Models (DCMs) to capture the human perception of facial expressions. In a first step, the static case is treated, that is, modelling the perception of facial images. Image information is extracted using a computer vision tool called the Active Appearance Model (AAM). The DCM attributes are based on the Facial Action Coding System (FACS), Expression Descriptive Units (EDUs) and the outputs of the AAM. Behavioural data were collected using an Internet survey in which respondents were asked to label facial images from the Cohn–Kanade database with expressions. Different models were estimated by likelihood maximization using the obtained data. In a second step, the proposed static discrete choice framework is extended to the dynamic case, which considers facial videos instead of images. The model theory is described, and another Internet survey is currently being conducted to obtain expression labels on videos. In this second survey, the videos come from the Cohn–Kanade database and the Facial Expressions and Emotions Database (FEED).
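As a minimal illustration of the discrete-choice idea described above (not the authors' estimated model), labelling can be framed as a multinomial logit over candidate expressions. All attribute names and weight values below are hypothetical placeholders, not quantities from the paper.

```python
import numpy as np

# Hypothetical attribute vector for one face image, e.g. AAM/FACS-derived
# descriptors (names and values are illustrative placeholders).
attributes = np.array([0.8, 0.1, 0.3])  # e.g. mouth opening, brow raise, eye closure

# One weight vector per candidate expression label (illustrative values).
labels = ["happiness", "surprise", "neutral"]
betas = np.array([
    [2.0, -0.5, 0.1],   # happiness
    [0.3,  1.8, 0.2],   # surprise
    [0.0,  0.0, 0.0],   # neutral (reference alternative)
])

# Multinomial logit: P(label) = exp(V_label) / sum_j exp(V_j),
# where V_label is the systematic utility beta_label . x.
utilities = betas @ attributes
probs = np.exp(utilities) / np.exp(utilities).sum()

for lab, p in zip(labels, probs):
    print(f"{lab}: {p:.3f}")
```

Estimating the `betas` by likelihood maximization over many labelled images is what turns this sketch into the kind of model the abstract describes.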
Jian-Ren Hou and Sarawut Kankham
Abstract
Purpose
Fact-checking is a process of seeking and displaying facts to confirm or counter uncertain information, which reduces the spread of fake news. However, little is known about how to promote fact-checking posts to online users on social media. Drawing on uncertainty reduction theory and message framing, the first study examines the effect of fact-checking posts with an avatar on online users' trust, attitudes and behavioral intentions. In the second study, the authors further investigate the congruency effects between promotional message framing (gain/loss/neutral) and the avatar's facial expression (happy/angry/neutral) on online users' trust, attitudes and behavioral intentions.
Design/methodology/approach
The authors conducted two studies and statistically analyzed 120 samples (study 1) and 519 samples (study 2) from Facebook users.
Findings
Results showed that including the neutral facial expression avatar in fact-checking posts leads to online users' greater trust and more positive attitudes. Furthermore, the congruency effects between loss message framing and the angry facial expression of the avatar can effectively promote online users' trust and attitudes as well as stronger intentions to follow and share.
Originality/value
This study offers theoretical implications for fact-checking studies, and practical implications for online fact-checkers to apply these findings to design effective fact-checking posts and spread the veracity of information on social media.
Kuan Cheng Lin, Tien‐Chi Huang, Jason C. Hung, Neil Y. Yen and Szu Ju Chen
Abstract
Purpose
This study aims to introduce an affective computing‐based method of identifying student understanding throughout a distance learning course.
Design/methodology/approach
The study proposed a learning emotion recognition model comprising three phases: feature extraction and generation, feature subset selection and emotion recognition. Features are extracted from facial images by transforming a given measurement of facial expressions into a new set of features defined and computed by eigenvectors. Feature subset selection uses immune memory clone algorithms to optimize the selection. Emotion recognition uses a classifier to build the connection between facial expression and learning emotion.
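A rough sketch of the eigenvector-based feature extraction feeding a classifier: a plain PCA projection ("eigenfaces") followed by a generic SVM. The immune memory clone selection phase is omitted, and the random arrays stand in for a real database such as JAFFE.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-ins for flattened grayscale face crops and emotion labels
# (real data would come from a database such as JAFFE).
X = rng.normal(size=(120, 48 * 48))  # 120 images, 48x48 pixels
y = rng.integers(0, 3, size=120)     # 3 emotion classes

# PCA projects each face onto the leading eigenvectors of the training
# set ("eigenfaces"), giving a compact feature vector per image.
model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
model.fit(X, y)

features = model.named_steps["pca"].transform(X)
print(features.shape)  # each image reduced to 20 eigen-features
```

In the paper's pipeline, a feature subset selector would sit between the projection and the classifier; here the full 20-dimensional projection is passed straight through.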
Findings
Experimental results using the basic expression of facial expression recognition research database, JAFFE, show that the proposed facial expression recognition method has high classification performance. The experiment results also show that the recognition of spontaneous facial expressions is effective in the synchronous distance learning courses.
Originality/value
The study shows that identifying student comprehension based on facial expression recognition in synchronous distance learning courses is feasible. This can help instructors understand student comprehension in real time, so they can adapt their teaching materials and strategies to fit the learning status of their students.
Giuliana Isabella and Valter Afonso Vieira
Abstract
Purpose
The purpose of this paper is to investigate the emotional contagion theory in print ads, and expand the literature of smiling to different type of smiles and gender congruency. Emotional contagion happens when an emotion is transferred from a sender to a receiver by the synchronization of emotions from the emitter. Drawing on emotional contagion theory, the authors expand this concept and propose that smiles in static facial expressions influence product evaluation. They suggest that false smiles do not have the same impact as genuine smiles on product evaluation, and the congruence between the model gender–product in a static ad and the gender of the viewer moderates the effects.
Design/methodology/approach
In Experiment 1, subjects were randomly assigned to view one of two ad treatments to guard against systematic error (e.g. bias). In Experiment 2, it was investigated whether viewing a static ad featuring a model with a false smile can result in a positive product evaluation, as was the case with genuine smiles (H3). In Experiment 3, it was assumed that when consumers evaluate an ad featuring a smiling face, the facial expression influences product evaluation, and this influence is moderated by the congruence between the gender of the ad viewer and the gender of the model in the ad.
Findings
Across three experiments, the authors found that the model’s facial expression influenced the product evaluation. Second, they supported the association between a model’s facial expression and mimicry synchronization. Third, they showed that genuine smiles have a higher impact on product evaluation than false smiles. This novel result enlarges the research on genuine smiles to include false smiles. Fourth, the authors supported the gender–product congruence effect in that the gender of the ad’s reader and the model have a moderating effect on the relationship between the model’s facial expression and the reader’s product evaluation.
Originality/value
Marketing managers would benefit from understanding that genuine smiles can encourage positive emotions on the part of consumers via emotional contagion, which would be very useful to create a positive effect on products. The authors improved upon previous psychological theory (Gunnery et al., 2013; Hennig-Thurau et al., 2006) showing that a genuine smile results in higher evaluation scores of products presented in static ads. The theoretical explanation for this effect is the genuine smile, which involves contraction of both zygomatic major and orbicularis oculi muscles. These facial muscles can be better perceived and transmit positive emotions (Hennig-Thurau et al., 2006).
Abstract
Purpose
Facial expression provides abundant information for social interaction, and the analysis and utilization of facial expression data are playing a major driving role in many areas of society. Facial expression data can reflect people's mental state; in health care, analyzing and processing such data can help improve people's health. This paper introduces several important public facial expression databases and describes the process of facial expression recognition. The standard facial expression databases FER2013 and CK+ were used as the main training samples. At the same time, facial expression image data of 16 Chinese children were collected as supplementary samples. With the help of the VGG19 and ResNet18 deep convolutional neural network models, this paper studies and develops an information system for the diagnosis of autism from facial expression data.
Design/methodology/approach
The facial expression data of the training samples are based on the standard expression databases FER2013 and CK+, which are common facial expression data sets suitable for research on facial expression recognition. On the basis of these databases, this paper uses a support vector machine (SVM) and deep convolutional neural network models (a baseline CNN, VGG19 and ResNet18) to complete the facial expression recognition.
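As a toy illustration of the SVM baseline mentioned above (not the VGG19/ResNet18 models, and not real FER2013/CK+ data), classifying flattened face images into emotion classes might look like this; the synthetic arrays merely imitate the 48x48, 7-class shape of FER2013.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic stand-in for FER2013-style data: 48x48 grayscale faces,
# 7 emotion classes (real training would load the actual database).
n, side, classes = 200, 48, 7
y = rng.integers(0, classes, size=n)
# Shift each class's mean so the toy problem is learnable.
X = rng.normal(size=(n, side * side)) + y[:, None] * 0.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# RBF-kernel SVM on raw pixel vectors, the simplest of the compared models.
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"toy test accuracy: {acc:.2f}")
```

The deep models in the paper replace the raw-pixel representation with learned convolutional features, which is what lifts recognition accuracy on real images.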
Findings
In this study, ten normal children and ten autistic patients were recruited to test the accuracy of the information system and the diagnostic effect of autism. After testing, the accuracy rate of facial expression recognition is 81.4 percent. This information system can easily identify autistic children. The feasibility of recognizing autism through facial expression is verified.
Research limitations/implications
The CK+ facial expression database contains some adult facial expression images. In order to improve the accuracy of facial expression recognition for children, more facial expression data of children will be collected as training samples. Therefore, the recognition rate of the information system will be further improved.
Originality/value
This research uses facial expression data and the latest artificial intelligence technology, which is advanced in technology. The diagnostic accuracy of autism is higher than that of traditional systems, so this study is innovative. Research topics come from the actual needs of doctors, and the contents and methods of research have been discussed with doctors many times. The system can diagnose autism as early as possible, promote the early treatment and rehabilitation of patients, and then reduce the economic and mental burden of patients. Therefore, this information system has good social benefits and application value.
Abstract
Purpose
In recent years, facial expression recognition has been widely used in human–machine interaction, clinical medicine and safe driving. However, conventional recurrent neural networks are limited to learning the time-series characteristics of expressions from one-way propagation of information.
Design/methodology/approach
To overcome this limitation, this paper proposes a novel model based on bidirectional gated recurrent unit networks (Bi-GRUs) with two-way propagation, and identity-mapping residual connections are adopted to prevent the vanishing-gradient problem caused by the depth of the network. Since the Inception-V3 network used for spatial feature extraction has too many parameters and is prone to overfitting during training, this paper adds two reduction modules to reduce the parameter count, obtaining an Inception-W network with better generalization.
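The bidirectional-GRU idea can be sketched in plain NumPy: one GRU runs forward in time, a second runs backward, and their hidden states are concatenated per frame. The residual connections and the Inception-W feature extractor are omitted, and all shapes and values are illustrative, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

def gru_step(x, h, W, U, b):
    """One GRU step. W, U, b stack the update (z), reset (r) and
    candidate (n) parameters along their first axis."""
    Wz, Wr, Wn = W
    Uz, Ur, Un = U
    bz, br, bn = b
    z = sigmoid(x @ Wz + h @ Uz + bz)        # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)        # reset gate
    n = np.tanh(x @ Wn + (r * h) @ Un + bn)  # candidate state
    return (1 - z) * h + z * n

def bi_gru(seq, params_f, params_b, hidden):
    """Run one GRU forward and one backward over the sequence and
    concatenate the two hidden states at each time step."""
    T = len(seq)
    h_f = np.zeros(hidden)
    h_b = np.zeros(hidden)
    outs_f, outs_b = [], [None] * T
    for t in range(T):
        h_f = gru_step(seq[t], h_f, *params_f)
        outs_f.append(h_f)
    for t in reversed(range(T)):
        h_b = gru_step(seq[t], h_b, *params_b)
        outs_b[t] = h_b
    return np.stack([np.concatenate([f, b]) for f, b in zip(outs_f, outs_b)])

def make_params(inp, hidden):
    W = rng.normal(scale=0.1, size=(3, inp, hidden))
    U = rng.normal(scale=0.1, size=(3, hidden, hidden))
    b = np.zeros((3, hidden))
    return W, U, b

# Toy sequence of per-frame feature vectors (stand-ins for spatial
# features extracted from expression video frames).
T, inp, hidden = 8, 16, 12
seq = rng.normal(size=(T, inp))
out = bi_gru(seq, make_params(inp, hidden), make_params(inp, hidden), hidden)
print(out.shape)  # (8, 24): forward and backward states concatenated
```

Because every output row sees both past and future frames, the classifier on top can exploit the two-way temporal context that a one-directional RNN lacks.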
Findings
Finally, the proposed model is pretrained to determine the best settings and selections. The pretrained model is then evaluated on two facial expression data sets, CK+ and Oulu-CASIA, and its recognition performance and efficiency are compared with existing methods. The highest recognition rate is 99.6%, which shows that the method achieves good recognition accuracy within a certain range.
Originality/value
By applying the proposed model to facial expression tasks, its high recognition accuracy and robust results at lower time cost will help to build more sophisticated real-world applications.
Andreas Schwab, Yanjinlkham Shuumarjav, Jake B. Telkamp and Jose R. Beltran
Abstract
The use of artificial intelligence (AI) in management research is still nascent and has primarily focused on content analyses of text data. Some method scholars have begun to discuss the potential benefits of far broader applications; however, these discussions have not led yet to a wave of corresponding AI applications by management researchers. This chapter explores the feasibility and the potential value of using AI for a very specific methodological task: the reliable and efficient capturing of higher-level psychological constructs in management research. It introduces the capturing of basic emotions and emotional authenticity of entrepreneurs based on their macro- and microfacial expressions during pitch presentations as an illustrative example of related AI opportunities and challenges. Thus, this chapter provides both motivation and guidance to management scholars for future applications of AI to advance management research.
Fowei Wang, Bo Shen, Shaoyuan Sun and Zidong Wang
Abstract
Purpose
The purpose of this paper is to improve the accuracy of the facial expression recognition by using genetic algorithm (GA) with an appropriate fitness evaluation function and Pareto optimization model with two new objective functions.
Design/methodology/approach
To achieve facial expression recognition with high accuracy, the Haar-like features representation approach and the bilateral filter are first used to preprocess the facial image. Second, the uniform local Gabor binary patterns are used to extract the facial feature so as to reduce the feature dimension. Third, an improved GA and Pareto optimization approach are used to select the optimal significant features. Fourth, the random forest classifier is chosen to achieve the feature classification. Subsequently, some comparative experiments are implemented. Finally, the conclusion is drawn and some future research topics are pointed out.
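The third and fourth steps above (GA-based feature selection feeding a random forest classifier) might be sketched as follows. The fitness function, GA operators and synthetic data are simplified stand-ins for the paper's actual design, which also layers a Pareto model on top.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Toy data standing in for extracted facial features (e.g. uniform local
# Gabor binary patterns); only the first 5 of 30 columns are informative.
n, d = 150, 30
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d))
X[:, :5] += y[:, None]

def fitness(mask):
    """Fitness = cross-validated accuracy of a random forest trained on
    the selected feature subset (a stand-in for the paper's fitness
    evaluation function)."""
    if not mask.any():
        return 0.0
    clf = RandomForestClassifier(n_estimators=30, random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# Minimal GA: tournament selection, uniform crossover, bit-flip mutation.
pop = rng.random((12, d)) < 0.5
for gen in range(10):
    scores = np.array([fitness(m) for m in pop])
    # tournament selection: the fitter of each random pair survives
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    winners = np.where(scores[idx[:, 0]] >= scores[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    # uniform crossover between consecutive parents
    cross = rng.random(pop.shape) < 0.5
    children = np.where(cross, parents, np.roll(parents, 1, axis=0))
    # bit-flip mutation
    pop = children ^ (rng.random(pop.shape) < 0.02)

scores = np.array([fitness(m) for m in pop])
best = pop[scores.argmax()]
print("selected features:", np.flatnonzero(best))
```

A Pareto variant would keep a front of masks rather than a single best one, trading feature-subset quality off against a second objective instead of collapsing everything into one fitness score.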
Findings
The experimental results show that the proposed facial expression recognition algorithm outperforms those in the existing literature in terms of both accuracy and computational time.
Originality/value
The GA and Pareto optimization algorithm are combined to select the optimal significant feature. To improve the accuracy of the facial expression recognition, the GA is improved by adjusting an appropriate fitness evaluation function, and a new Pareto optimization model is proposed that contains two objective functions indicating the achievements in minimizing within-class variations and in maximizing between-class variations.
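The two objectives named above can be made concrete with the usual within-class and between-class scatter quantities; the exact definitions below are an assumption for illustration, not the paper's formulas.

```python
import numpy as np

def class_scatters(X, y):
    """Within-class and between-class variation for a feature subset:
    the two quantities a Pareto model would minimize and maximize,
    respectively (definitions here are illustrative assumptions)."""
    mu = X.mean(axis=0)
    within = between = 0.0
    for c in np.unique(y):
        Xc = X[y == c]
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum()
        between += len(Xc) * ((Xc.mean(axis=0) - mu) ** 2).sum()
    return within, between

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 4))
X[y == 1] += 2.0  # shift class 1 so between-class variation dominates

w, b = class_scatters(X, y)
print(f"within-class: {w:.1f}, between-class: {b:.1f}")
```

A feature subset that lowers `within` while raising `between` is Pareto-preferred, which is exactly the trade-off the two objective functions encode.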