Have you ever felt hyper-aware of yourself while on a webcam? Manlu Liu, a third-year Ph.D. student, takes this question a step further: how does self-monitoring reshape our connection with others?
In today’s digital age, where interactions often occur behind screens, the nuances of human connection and emotion have never been more complex. At the heart of understanding these subtleties lies the ability to read and interpret affective facial expressions. This skill might be significantly hindered by our self-awareness, or ‘self-monitoring’.
Unveiling this intriguing concept is Manlu Liu, a third-year Ph.D. student working under the supervision of Dr. James T. Enns at UBC’s Vision Lab, whose research examines how self-monitoring influences the perception of emotional cues.
Manlu completed her undergraduate degree in Psychology at McGill University, where she worked as a Research Assistant in various labs, contributing to projects spanning multiple psychology research streams. Her work, particularly on social attention and cognition, deepened her fascination with the human capacity to comprehend and empathize with others’ thoughts and feelings. She chose UBC’s psychology program for its research diversity and alignment with her interests, and that fascination led her to join Dr. Enns’ lab as a Master’s student, where she began her focused research on social perception and attention.
Read our Q&A with Manlu as she delves into her academic journey and the specifics of her recent research on self-monitoring:
Can you describe how your study was conducted?
This study was conducted with the help and guidance of Dr. Veronica Dudarev (a postdoctoral research fellow in UBC’s psychology department) and Dr. Enns. It was run entirely online: we used webcams and pre-programmed conversations to induce self-monitoring or other-monitoring in participants before they classified the affective facial expressions of video-recorded actors. The figure below shows the study procedures.
What key findings emerged from your research regarding self-monitoring and its impact on understanding facial expressions?
“We found that self-monitoring participants were less sensitive to others’ affective facial expressions.”
These participants performed worse when they were required to judge whether the facial expressions they saw were positive or negative, compared to participants who were monitoring others instead of themselves. This was especially true for participants who rated the pre-programmed conversations as high in believability.
How does self-monitoring affect individuals’ ability to interpret others’ facial expressions and how might this impact their social interactions and perceptions?
“The reduction in an individual’s ability to interpret facial expressions when self-monitoring may have a negative impact on their social interactions.”
I am really interested in these downstream consequences of self-monitoring, but it will take another study to find out if that is really the case or not.
In light of your findings, what suggestion would you offer to individuals with high self-monitoring tendencies to improve their emotional recognition skills?
I would suggest that individuals with high self-monitoring tendencies use techniques to focus their attention on their surrounding environment, instead of on themselves.
“...simply being aware of the reduced sensitivity to others that comes with self-monitoring may help people reduce their self-focus.”
What are the next steps in your research?
Currently, we are applying similar methodologies from the self-monitoring study to investigate people’s perception of gaze patterns during conversation. For example, we often look away from a conversational partner, sometimes to think, and sometimes to look at something in our environment. We are curious how sensitive people are to these different reasons for looking away.
This research was conducted in the Vision Lab which is led by Dr. James T. Enns.