Computing Professor Explores Mixed-Reality Cure for Zoom Fatigue
Looking to reduce the stress felt by people who videoconference too much, a condition widely known as Zoom fatigue that spread with the work-from-home trend of the COVID pandemic, one NJIT expert is developing new approaches to augmented- and virtual-reality systems.
Associate Professor Jacob Chakareski said his research was inspired by his own less-than-ideal experiences switching to remote lecturing in spring 2020, along with those of co-principal investigators Ramesh Sitaraman and Michael Zink of the University of Massachusetts Amherst and Klara Nahrstedt of the University of Illinois Urbana-Champaign. The team collectively received a three-year, $1.2 million National Science Foundation grant.
Mixed-reality systems, Chakareski said, can help people cope with the unnatural online meeting environment, which emphasizes participants' isolation and can leave them feeling marginalized, unseen, uncomfortable and less able to focus, resulting in less productive interactions. Participants may not be visible to one another and may be unsure of when to talk or where to look. The problems are even worse when some people are remote and others are together in person.
Instead, he said, online meetings could become immersive environments that offer participants a vivid experience, one that better simulates the feeling of in-person interaction and, in turn, supports higher productivity.
“The current technology is good, but the new normal requires us to take it further. What happens over platforms such as Zoom, Webex or Microsoft Teams is not natural,” Chakareski said. “We will provide a solution to that, allowing participants to suspend their disbelief through visual teleportation.”
The researchers call their project miVirtualSeat. In-room participants will wear augmented-reality headsets through which remote attendees, rendered as point-cloud volumetric video, appear to be present too, seated in physical chairs around the same table. Remote participants, in turn, will see the meeting room and the in-person attendees displayed in their own virtual-reality headsets. Such a system would, for example, allow a course instructor to hold a natural-feeling, classroom-style discussion with a group of students, some physically present in the room and others remotely occupying virtual seats.
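In geometric terms, seating a remote attendee amounts to a rigid transform that maps the attendee's captured point cloud from its own capture frame into the shared room frame at an empty chair's pose. The Python sketch below illustrates that idea only; the function names, the z-up coordinate convention and the example seat pose are assumptions made for this illustration, not details of the miVirtualSeat system.

```python
import numpy as np

def seat_anchor_transform(seat_position, seat_yaw_rad):
    """Build a 4x4 rigid transform that places an avatar at a physical
    seat: rotate about the vertical (z) axis by the seat's yaw, then
    translate to the seat's position in the room frame."""
    c, s = np.cos(seat_yaw_rad), np.sin(seat_yaw_rad)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])
    T[:3, 3] = seat_position
    return T

def place_avatar(points, seat_position, seat_yaw_deg):
    """Map an Nx3 avatar point cloud (in its own local frame, metres)
    into the shared room frame so an AR headset can render it in the chair."""
    T = seat_anchor_transform(seat_position, np.radians(seat_yaw_deg))
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (homogeneous @ T.T)[:, :3]

# Example: a stand-in avatar cloud placed at a seat 1.5 m along x, facing back.
avatar = np.random.rand(1000, 3) * 0.5
in_room = place_avatar(avatar, np.array([1.5, 0.0, 0.0]), seat_yaw_deg=180.0)
print(in_room.shape)  # (1000, 3)
```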
The NSF project will focus on the challenges that must be overcome to achieve that vision. Specifically, the team will develop ways to detect, track and localize distributed physical and virtual 360-degree avatars and objects in a joint immersive scene in real time; to reduce the bandwidth and latency of delivering integrated and synchronized 360-degree, volumetric and 2D/3D video together with ambisonic audio; and to ensure a good quality of experience in the form of natural interactions between physical and virtual participants.
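For a sense of what reducing the bandwidth of volumetric video can involve, one common and simple technique is voxel-grid downsampling, which replaces all the points that fall within a small cubic cell with their centroid before a frame is transmitted. The sketch below, in plain NumPy, is a generic illustration of that technique, not the project's actual compression pipeline.

```python
import numpy as np

def voxel_downsample(points, voxel_size=0.02):
    """Reduce per-frame point count (and hence bitrate) by keeping one
    representative point per occupied voxel: quantize coordinates to a
    cubic grid, then average the points that fall in each cell."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    sums = np.zeros((len(counts), 3))
    np.add.at(sums, inverse.ravel(), points)  # accumulate points per voxel
    return sums / counts[:, None]             # centroid of each voxel

frame = np.random.rand(100_000, 3)   # stand-in for one captured frame (metres)
reduced = voxel_downsample(frame, voxel_size=0.05)
print(f"{len(frame)} -> {len(reduced)} points")
```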
Chakareski and his collaborators are looking forward to putting together a prototype miVirtualSeat system on the NJIT campus, building on his recent and ongoing work. “We are going to create and experiment with the envisioned telepresence system across all three universities, working with AR/VR/360 cameras [and] 3D cameras,” he said. “If all goes according to plan, this could change the way the world interacts – and that is very exciting.”