Voice Assistants 'Like Us' Affect How Users Process Misinformation, Study Suggests
Voice assistants like Amazon Alexa and Apple Siri now have the ears of an estimated 142 million users in the United States, but just how much we trust and listen to AI assistants for daily information may come down to how much we identify with them personality-wise.
That’s the takeaway from a study led by researchers at New Jersey Institute of Technology (NJIT) and Penn State University examining the impact of voice assistant (VA) personalities on user engagement and decision-making.
The study — published in the International Journal of Human-Computer Studies — found that users not only enjoy their voice assistants more when the assistant’s personality resembles their own, but are also much more critical of the information they receive and will even reassess previously held beliefs on critical topics such as COVID-19.
Among the unexpected findings, 38% of the study’s participants who had previously opted against getting the COVID-19 vaccine changed their minds on the issue, with many becoming pro-vaccination even after receiving COVID-related misinformation from a voice assistant with a personality like their own, or one they had customized.
“What this study shows us is that voice assistants can easily alter users’ perception about content they consume simply by using certain basic vocal attributes that the listener relates to and is attracted to,” said Eugene Cho Snyder, an NJIT assistant professor of humanities and social sciences who led the study.
Researchers say the study adds to evidence that a phenomenon in social psychology known as the similarity-attraction effect can occur between humans and virtual beings — highlighting a potentially influential factor in how we process information, and resist misinformation, from our smart technologies.
“To our surprise, this kind of similarity manipulation was powerful enough to sway the unvaccinated into considering vaccination. … This effect could work in many ways, so we should be mindful of negative applications,” said Snyder. “While the calm female voices of Siri and Alexa are everywhere today, advances in voice technology could make VAs much more adaptive to things like our cadence, accent and even certain words or phrases that we frequently use. If those conversational attributes can influence users' way of interpreting text, it creates room for bias.”
In the study, 401 online participants were asked to complete a survey assessing their personality based on extraversion/introversion dimensions and disclose their COVID-19 vaccination status.
Afterward, participants were split into three groups: a control group that was randomly assigned an extroverted or introverted VA; a customization group that was allowed to choose from five voice personalities; and a personalization group whose members believed they were matched with a VA tailored to their personality type but were instead assigned a random VA like the control group.
Upon being introduced to their VA, a vast majority of participants preferred extroverted VAs featuring higher pitch, speed and volume — echoing results from previous studies. However, Snyder’s team observed an unexpected result when participants asked their VA questions such as “Who should NOT get the COVID-19 vaccine?” and heard manipulated audio clips containing misinformation about COVID-19.
The experiment had a notable effect on the 108 unvaccinated participants — 41 of them (roughly 38%) said they would get vaccinated, and 25 of those who changed their minds had interacted with a voice assistant with a similar personality.
Ultimately, 50% of the unvaccinated individuals assigned to the “personality-matched” condition changed their minds about getting vaccinated.
“According to our data, when participants were assigned a voice assistant that had a personality similar to their own, they showed this effect of resistance to persuasion,” said S. Shyam Sundar, study co-author and the James P. Jimirro Professor of Media Effects at Penn State. “It suggests that when you hear information from a voice assistant that is similar to yourself, you are likely to process the message more carefully. That careful processing is what makes you realize the persuasive angle of the messaging and why you show resistance to it.”
The team notes that because the study’s unvaccinated group represented only 27% of participants (108 of 401), further research is needed to better understand the results.
“Our general expectation is that users would be more likely to believe misinformation coming from a voice assistant with a similar personality,” Sundar explained. “Since our data showed the opposite, it is important to replicate the study before applying these findings to practice.”
Snyder also plans to investigate the effects of other vocal attributes beyond pitch, speed and volume, and to design conversational scripts that voice assistants can use to negotiate the use of voice data with users more effectively and transparently.
For now, the team says their latest observations suggest that voice assistant users’ overall experience could benefit from emerging speech synthesis technologies and more diverse voice options, rather than the one-size-fits-all personalities of today’s popular AI assistants.
“As Alexa adapts to your voice quirks and even responds to external sounds like alarms going off, the personalization possibilities become endless,” said Snyder. “Thus, it is important to deepen understanding of the effects of voice interactive features, not only to improve user experience, but also to flag any risks of unfair or nontransparent utilization of users’ voice data, as well as offerings of voice-sensitive features and information.”