Facial Expressions of Pain Among Patients in the Emergency Department


Objective: Facial expression tracking has rarely been used to measure pain in patients in an emergency clinical setting. This study examined facial expressions via an action unit (AU)-based model to determine the pain levels of emergency department (ED) patients.

Methods: Adult patients admitted to the ED due to complaints of headache, chest pain, abdominal pain, backache, or painful limbs at triage were included. Two assessments, each lasting at least 30 s, were performed for each participant. A basic description of the pain was acquired, including pain intensity measured with a self-report numerical rating scale (NRS). To identify the characteristics of facial expressions of pain, 18 facial AUs were assessed. A model was developed from the significantly correlated facial AUs, and the area under the receiver operating characteristic curve (AUC) of the AU-based model was calculated for different pain severities.

Results: In total, 429 video recording sessions were analyzed; 57.7% of participants were male and the mean age was 51.3 years. Abdominal pain (49.4%) was the most common complaint, followed by back/limb pain (12.3%). The initial and follow-up mean NRS scores were 6.4 and 3.7, respectively. Several AUs showed significant correlations with NRS scores, including AU 1 (brow raising, p=0.024), AU 4 (brow lowering, p<0.001), and AU 26 (jaw dropping, p=0.038). A model to predict pain severity that included AUs 1, 4, 5, 9, 26, and 45 was created. The AUCs of the prediction model for identifying severe pain and no pain were 0.633 and 0.645, respectively.

Conclusions: Although this study identified several facial AUs correlated with patients' self-reported NRS pain scores, analysis of facial expressions alone failed to accurately predict pain among patients in the ED. Further studies should aim to develop a more comprehensive means of measuring pain via multiple behavioral measurements.


Introduction
Pain, which can be defined as an unpleasant sensation experienced after a harmful event (1), represents a major reason for engaging with medical services (1)(2)(3). The type of pain experienced varies according to the nociceptors involved (5,6). Pain involves the detection, transmission, and modulation of signals by receptors (7,8). In addition to sensory-discriminative factors, motivational-affective and cognitive variables are also involved in the experience of pain (9). Furthermore, the interpretation of painful stimuli can be affected by cultural, personality-related, experiential, environmental, and emotional factors (7)(8)(9).
Previous studies have tended to measure pain via behavioral assessments, for example of facial expressions. Neural control of the facial musculature in response to pain involves both voluntary and involuntary actions (10). Ekman developed the Facial Action Coding System (FACS), which divides the face into 44 action units (AUs) involved in the production of different facial expressions (11). The prototypic facial expression of pain engages the AUs responsible for brow lowering, orbital tightening, levator contraction, and eye closure (12). Training in the recognition of specific cues, such as brow lowering, eye closing, nose wrinkling, and mouth opening, improves the ability to discriminate 'deceptive' pain expressions (13,14). Painful facial expressions involve a combination of emotions, including anger, fear, and disgust (13)(14)(15). Previous studies have demonstrated the utility of the automated FACS for monitoring pain (16,17). However, the use of facial expression tracking to measure pain in patients in an emergency clinical setting has been limited. To facilitate pain measurement in the emergency department (ED), analyses of the facial expressions of patients in pain should be performed. This study examined facial expressions via an AU-based model to determine the pain levels of ED patients. We also evaluated whether the model is suitable for future use as a tool for pain measurement in ED patients.

Study design
This study used a cross-sectional design. Convenience sampling of patients admitted to the ED of a university-affiliated hospital over a 1-year period was performed.

Ethics approval and consent to participate
This study was approved by the Ethics Committee of Chang Gung Medical Foundation, Taipei, Taiwan (approval no. 104-3625B). Informed consent was obtained from all participants.

Patients
Adult patients (≥ 18 years old) admitted to the ED due to complaints of headache, chest pain, abdominal pain, backache, or painful limbs at triage were included. Patients who had undergone trauma, received prior treatment, were lost to follow-up, were unwilling to participate, or had been referred from another center were excluded.

Study protocol
The primary investigators and research assistants identified adult patients who met the inclusion criteria at ED triage, and informed consent was obtained from the patients. Two assessments, each lasting at least 30 s, were performed for each participant (i.e., at initial triage and approximately 1 h after the administration of treatment, if any). A basic description of the pain was acquired, including its location, duration, and intensity, using a numerical rating scale (NRS) ranging from 0 (no pain) to 10 (worst pain imaginable) (18,19). NRS scores of 1-4, 5-7, and 8-10 were considered to indicate mild, moderate, and severe pain, respectively (18,19). To standardize the patients' attention, the primary investigator or a research assistant sat behind an HDR Handycam® (Sony, Tokyo, Japan) mounted on a tripod to acquire the pain description and the video recording in a designated assessment room. The patients were instructed to look at the camera and answer the questions. The camera was positioned to allow consistent capture of facial expressions between sessions.
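The NRS cut-offs above can be expressed as a simple categorization function. This is a minimal sketch; the function name and category labels are illustrative, not part of the study protocol.

```python
def nrs_category(score: int) -> str:
    """Map a 0-10 numerical rating scale (NRS) score to a pain-severity
    category using the study's cut-offs: 1-4 mild, 5-7 moderate, 8-10 severe."""
    if not 0 <= score <= 10:
        raise ValueError("NRS score must be between 0 and 10")
    if score == 0:
        return "none"
    if score <= 4:
        return "mild"
    if score <= 7:
        return "moderate"
    return "severe"
```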

Facial feature extraction
The open-access facial behavior analysis software OpenFace (ver. 0.4.0) was used to extract facial features from the videos and generate AUs. In a pre-processing step using a facial feature-detecting algorithm, constrained local neural fields (CLNFs) were employed to detect facial landmarks in all video recordings using a point-distribution model. A layered, unidirectional graphical model was also applied by local neural field patch experts, together with an optimization-fitting procedure (non-uniform regularized landmark mean-shift) (20). In total, 68 facial landmarks (located at the eyes and nose contour, for example) were tracked based on the active orientation model generated for each patient (Fig. 1) (21). To identify the characteristics of facial expressions of pain, 18 facial AUs were assessed (Table 1). Each video was divided into 0.03 s frames for facial AU coding and analysis. The mean amplitude of each AU was calculated across the individual frames.
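The per-session aggregation step can be sketched as follows. This is an illustration under assumptions: OpenFace writes one CSV row per frame with AU intensity columns conventionally named `AU01_r`, `AU04_r`, etc., and the helper below simply averages those columns; the function name and example values are hypothetical.

```python
from statistics import mean

def mean_au_amplitudes(frames, au_columns):
    """Average each AU's intensity over all frames of one recording session.

    frames:     list of dicts, one per ~0.03 s video frame, mapping
                column name -> intensity value (OpenFace's *_r columns)
    au_columns: AU intensity columns to aggregate, e.g. ["AU01_r", "AU04_r"]
    """
    return {au: mean(float(f[au]) for f in frames) for au in au_columns}

# Hypothetical two-frame example:
frames = [{"AU01_r": 0.0, "AU04_r": 2.0},
          {"AU01_r": 1.0, "AU04_r": 4.0}]
amplitudes = mean_au_amplitudes(frames, ["AU01_r", "AU04_r"])
# → {"AU01_r": 0.5, "AU04_r": 3.0}
```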

Model covariates
Covariates entered into the models included patient age (years), sex, chief complaint, body part affected, NRS pain scores at the initial and follow-up sessions, duration of the video recordings, vital signs at ED triage (i.e., body temperature, heart rate, blood pressure, and respiration rate), analgesia provision, and hospital admission status.

Outcome measures
The main outcome measure was the area under the receiver operating characteristic (ROC) curves of the AU-based model for different pain severities; the model was developed according to analyses of the significantly correlated facial AUs.

Statistical analyses
The demographic characteristics of the participants are presented as numbers (%) and means (standard deviations).
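The study's outcome measure, the AUC of an ROC curve, is equivalent to the probability that a randomly chosen positive case (e.g., severe pain) receives a higher model score than a randomly chosen negative case. A minimal pure-Python sketch of that statistic follows; the labels and scores shown are synthetic illustrations, not study data.

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the fraction of positive/negative pairs in which the positive case
    is scored higher, counting ties as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic example: 1 = severe pain, 0 = otherwise.
labels = [1, 0, 1, 0, 0, 1, 0, 1]
scores = [0.8, 0.3, 0.6, 0.4, 0.2, 0.7, 0.5, 0.4]
auc = roc_auc(labels, scores)  # → 0.90625
```

An AUC of 0.5 indicates chance-level discrimination, which is why the study's values of 0.528-0.645 are interpreted as fair to poor.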

Results
In total, 241 participants (429 video recording sessions) were enrolled. Table 1 shows the basic characteristics of the participants; 57.7% were male and the mean age was 51.3 years. None had beards or wore a nasal cannula or oxygen face mask. All participants were Asian. Abdominal pain (49.4%) was the most common complaint at ED triage, followed by back/limb pain (12.3%). The initial and follow-up mean NRS scores were 6.4 and 3.7, respectively. In total, 140 participants (58.1%) had taken medications to alleviate pain, and 26.1% were admitted to the hospital. The performance of the prediction model was then examined by calculating the AUC after plotting the ROC curves for the different pain severities (Table 3). The AUC of the prediction model showed a minimum value of 0.528 (moderate pain group) and a maximum value of 0.645 (no pain group). In analyses of pain location by severity, the lowest AUC was 0.532 (abdomen, moderate pain) and the highest was 0.770 (abdomen, no pain).

Discussion
These findings echo previous results with respect to the AUs most associated with pain expression, namely AUs 4 and 9 (12,22,23). Various other AUs related to pain expression have been reported, such as AU 6 (cheek raising), AU 7 (eyelid tightening), AU 10 (upper lip raising), AU 12 (lip corner pulling), AU 16 (lower lip depression), AU 25 (lip parting), and AU 43 (eye closing) (12,22,23). The majority of previous studies have recorded facial pain expressions induced by specific stimuli or exercises (12,14,24-27).
The method used in this study was different, being based largely on observation without the application of painful stimuli. Furthermore, the characteristics of patient populations are likely to differ among studies, for example in terms of the intensity and causes of pain. Finally, the clinical setting and purpose of pain measurements can differ among studies; in this study the data were obtained in an ED setting. In summary, although universal AUs underlying painful facial expression might exist, differences remain among studies that could influence their identification.
This study created a prediction model based on the correlations among facial AUs. However, the predictive power of the model was only fair to poor, in terms of both identifying those experiencing no pain (AUC, 0.645; 95% CI: 0.528-0.761) and those suffering from severe pain (AUC, 0.633; 95% CI: 0.571-0.694). Subgroup analyses of different pain sites again revealed only fair to poor predictive ability of the model. Several aspects should be discussed when interpreting our results. First, Hadjistavropoulos et al. indicated that overall facial activity (total frequency and average intensity), rather than individual AUs, differentiates painful from non-painful periods in hospitalized elders (28). Further research confirmed similar findings using induced pain among older adults (29). However, the authors noted that AU 43 (eye closing) and AU 45 (blinking) showed high frequency during the pain segments, which is consistent with our results. The Facial Action Coding System identifies the facial muscle actions involved in recognizing the basic emotions. Examples of emotion-related facial actions are sadness: AU 1, 4, 5; fear: AU 1, 2, 4, 5, 7, 20, 26; anger: AU 4, 5, 7, 23; and disgust: AU 9, 15, 16 (30). Patients in pain who visit the ED may present with mixed emotions in varying degrees, such as sadness, fear, anger, and disgust. Therefore, a prediction model composed of specific AUs is limited in its performance.
Second, there might be a difference between short- and long-lasting emotional facial expressions. Different facial expressions have been associated with particular types of pain stimuli at different times (31). Unlike laboratory experiments, in which facial expressions are recorded immediately after the pain stimulus, the timing of the recordings obtained in our study varied, which might have reduced the performance of the model in predicting pain.
Third, this study used baseline-corrected amplitudes derived from the mean AUs of all 0.03 s video frames. For participants who showed little emotional change during the ED visit, the signal of interest may not have been captured. This process might have compromised the sensitivity of the AUs, which could have reduced the effectiveness of the model.
Fourth, the experience of pain differs according to the nociceptors that are stimulated; pain types include superficial and deep somatic pain, as well as visceral pain (5,6). Whether different types of pain are reflected in the same facial expressions remains unclear. In this study, pain was categorized by body part. Determining the type of pain experienced by an individual relies on the clinical judgment of ED physicians, who may operate according to different principles and personal experiences. To avoid any bias arising from subjective judgments, the type of pain was not recorded. However, our results suggest that facial pain expressions differ according to the body part affected.
According to our results, there are barriers to replacing current pain assessment methods with facial expression analyses performed in the ED. Pain is subjective, involving sensory-discriminative, motivational-affective, and cognitive factors (9). Furthermore, pain experience is also affected by the culture, personality, experiences, environment, and emotional state of a patient (7-9). To date, self-report NRS scores have served as the primary means of measuring pain. Instead of replacing these self-reported pain scores, using additional behavioral and physiological measurement techniques as an adjunct might facilitate more accurate pain assessment (32). Comprehensive behavioral assessments encompassing several methods, including facial expression analyses, are warranted. Improving the care of patients in pain by integrating such methods should be the ultimate goal.

Limitations
There were several limitations to the present study. First, sampling and selection biases were present, as we applied a convenience sampling method. Furthermore, several patients declined to be videoed, and no patients who required resuscitation or emergent procedures were included. Therefore, the study population comprised mainly patients of triage levels 3-5 who were willing to have their facial expressions recorded. Triage level 1 and 2 patients were in critical condition and were triaged according to variables other than NRS pain scores (e.g., unconsciousness, respiratory distress, or circulatory collapse). Thus, our study population may not be representative of all patients in pain. Second, unmeasured confounders may have been in play: facial expressions might be affected by the type, cause, and intensity of pain, as well as by the emotional state, personality, experiences, environment, and culture of the patient. Most of these variables are difficult to measure and thus were not assessed in this study. Third, the study included only a limited number of participants, all of whom were drawn from a single university-affiliated hospital in northern Taiwan; this may limit the generalizability of our findings. Further studies conducted in different settings would therefore be of value.

Conclusion
Although this study identified several facial AUs correlated with NRS pain scores, analyses of facial expressions alone failed to accurately predict pain among patients in the ED. Further studies should aim to develop a more comprehensive means of measuring pain via multiple behavioral measurements. In addition, drawing on existing findings on the pain experience would help to facilitate pain assessment and ultimately improve the care of patients in pain.