WiFi and Light-Based Networks Combine for Better Virtual Reality Systems
"Why not both?" is a social media meme, but it's also the proposed answer to a problem for the designers of virtual reality systems, who must otherwise choose between wireless links that are reliable but too slow for the job, or fast but easily disrupted.
It's already hard enough to make virtual reality look realistic, without choppiness or delays, when a user is sitting still and plugged into a wired network, NJIT Associate Professor Jacob Chakareski explained. But making a system that's wireless, adding multiple users and incorporating six degrees of freedom — where a person controls the pitch, roll and yaw of their head, in addition to walking forward/backward, side-to-side and up-and-down — is an extraordinarily difficult challenge.
Chakareski and his former postdoctoral researcher, York University Assistant Professor Mahmudur Khan, devised a way to address these problems: stream a lower-quality representation of the entire scene over stable WiFi, and an enhanced-quality, direction-specific detail layer over a speedy network of visible light communications. The approach earned the duo a Best Paper award at the ACM Multimedia Systems conference held two months ago in Istanbul.
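The two-layer idea can be illustrated with a short sketch. This is not the authors' implementation; the function name, tile representation, and scheduling logic are hypothetical, but they show the basic split: a low-quality base layer of the whole scene goes to every user over WiFi, while high-quality enhancement data for only the tiles inside each user's current viewport goes over the faster visible-light link.

```python
def schedule_layers(tiles, viewports):
    """Hypothetical scheduler for dual-connectivity VR streaming.

    tiles: list of tile ids covering the full 360-degree scene.
    viewports: dict mapping user id -> set of tile ids that user can see.
    Returns (wifi_plan, vlc_plan), each mapping user -> list of
    (tile, layer) transmissions.
    """
    # Base layer: every tile at low quality over the stable WiFi link,
    # so the whole scene is always available even if the VLC link drops.
    wifi_plan = {user: [(t, "base") for t in tiles] for user in viewports}
    # Enhancement layer: only the tiles in each user's viewport, at high
    # quality, over the fast but fragile visible-light link.
    vlc_plan = {user: [(t, "enhancement") for t in tiles if t in visible]
                for user, visible in viewports.items()}
    return wifi_plan, vlc_plan


# Example: two users looking at different parts of the same scene.
tiles = ["t1", "t2", "t3", "t4"]
viewports = {"alice": {"t1", "t2"}, "bob": {"t3"}}
wifi_plan, vlc_plan = schedule_layers(tiles, viewports)
```

Because the base layer always covers the full scene, a momentary loss of the light link degrades quality in the viewport rather than blanking it out entirely.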
"It was a great experience and I was very happy that on the last day they gave us this award and recognition," Chakareski said. "Mobile virtual reality is an application that requires a very high data rate and a low level of latency."
Chakareski and Khan's method, which uses their own system design and algorithms to maintain fragile connections over the light networks, is explained in their paper WiFi-VLC Dual Connectivity Streaming System for 6DOF Multi-User Virtual Reality. They showed gains of up to 10 decibels in viewport quality. That's almost like watching a video in standard definition and then switching to an IMAX theater, Chakareski said.
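The 10-decibel figure is most likely peak signal-to-noise ratio (PSNR), the standard video-quality metric, though the article does not say so explicitly. On that assumption, the scale of the gain is easy to quantify: every 10 dB of PSNR corresponds to a tenfold reduction in mean squared error between the delivered and original frames.

```python
import math

def psnr(mse, max_val=255.0):
    """Peak signal-to-noise ratio in decibels for 8-bit pixel values."""
    return 10.0 * math.log10((max_val ** 2) / mse)

# A 10 dB improvement means one tenth the mean squared error:
baseline = psnr(100.0)   # ≈ 28.1 dB
improved = psnr(10.0)    # ≈ 38.1 dB, i.e. exactly 10 dB higher
```

Because the formula is logarithmic, the difference psnr(mse / 10) - psnr(mse) is exactly 10 dB regardless of the starting error level.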
Right now the idea is based on mathematical theory and simulated using large servers, so the next step would be to prototype it, which could take several years. The researchers must find ways to shrink the graphics processing hardware and make sure the light transmitters always remain in the line of sight of headset-mounted photodiode receivers, even as users move around.
"After the conference, many people asked questions and they were interested in collaborating," Chakareski added.
"People expect a lot from virtual and augmented reality, in terms of the applications you can develop. So far there have been a variety of applications, but many of them focus on entertainment and gaming, training and education," he noted. A shift from social networking into virtual worlds could also inspire new applications.
The new method could be used in mission-critical systems, such as flight simulation. Chakareski said the U.S. Air Force has provided some of his funding and he has tested versions for drones. Current funding is through the National Science Foundation, which began a series of grants in 2017-2018 that were extended because of the COVID-19 pandemic. Healthcare is another possible application; he previously worked on virtual reality systems for people with low vision. Other applications could include first aid, environmental monitoring and even travel.
Chakareski also said he may consider commercializing this research through a start-up company someday.