VROOM Brings AMIA 2023 Conference into the Center of Ontologies
You are here. This familiar phrase, used to point the way toward a quicker and less complicated route, can now be applied to the complex world of medical ontologies, thanks to a project led by the Ying Wu College of Computing’s Professor Jim Geller and Assistant Professor Margarita Vinnikov.
A paper on their Virtual Reality Ontology Object Manipulation (VROOM) system was recently presented at the 2023 American Medical Informatics Association (AMIA) Symposium in New Orleans, along with a demo of the system. Geller, together with co-principal investigators Ph.D. candidate Navya Martin Kollapally M.S. ’20 and Vipina K. Keloth Ph.D. ’21, also presented a separate paper, “Integrating Commercial and Social Determinants of Health: A Unified Ontology for Non-Clinical Determinants of Health.” AMIA is the top medical informatics conference in the U.S. and devotes its efforts to collecting, analyzing and applying data directly to healthcare decisions.
In simple terms, an ontology is a set of concepts and categories in a subject area or domain, showing their properties and the relations between them. Medical ontologies are systematic representations of knowledge that can be used to integrate and analyze large amounts of heterogeneous data, allowing precise classification of patients.
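To make the idea concrete, here is a minimal sketch in Python of an ontology fragment as concepts linked by “is-a” relations; the concept names are illustrative examples, not drawn from VROOM or any specific medical terminology:

```python
# A minimal illustration of an ontology fragment: named concepts,
# their properties, and "is-a" relations between them.
# Concept names are illustrative, not from any real medical ontology.

ontology = {
    "Disease":           {"is_a": [],                    "properties": ["name"]},
    "InfectiousDisease": {"is_a": ["Disease"],           "properties": ["pathogen"]},
    "ViralPneumonia":    {"is_a": ["InfectiousDisease"], "properties": ["affected_organ"]},
}

def ancestors(concept: str) -> set[str]:
    """Collect every concept reachable through 'is-a' links."""
    found = set()
    for parent in ontology[concept]["is_a"]:
        found.add(parent)
        found |= ancestors(parent)
    return found

# A patient coded with "ViralPneumonia" is automatically classified
# under "InfectiousDisease" and "Disease" as well.
print(ancestors("ViralPneumonia"))  # {'InfectiousDisease', 'Disease'}
```

This is what makes ontologies useful for patient classification: a single precise code carries all of its more general categories with it.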
Geller and his NJIT colleague Professor Yehoshua Perl, credited together with pioneering the visualization and auditing of medical ontologies, have published 66 major papers since 1996 and are considered leading experts on organizing medical concepts for use in the healthcare industry.
Despite their use in many functional areas, including insurance billing, ontologies remain complicated, Geller observed. “It all looks great but doesn’t scale when organizing 8,000+ concepts,” he said.
Protégé, a free, open-source ontology editor and knowledge management system with a single-plane display, has been the primary technology platform for data scientists since its inception in 1987.
Geller likens organizing data to climbing a mountain: VROOM whisks you straight to the top and provides a full 360-degree view.
According to Vinnikov, an expert in virtual and augmented reality (VR/AR) technology, VROOM offers a three-dimensional, editable and digestible visual display for the end user. It also greatly enhances the ability to find errors in ontology diagrams, which are now animated.
As Geller explained, “It lets you stand within the data” for a completely immersive experience.
Vinnikov and her team, including Veena Chaudhari M.S. ’22, Danielle Grunwald ’25, several NJIT capstone teams and members of Vinnikov’s iXR Lab, have taken extremely large amounts of information and organized it into nodes, or “books.” Using commonly recognized objects and terms, users can collapse, expand and manipulate nodes, which are arranged according to a hierarchical naming convention of parent, child, grandparent and so on. Searching for and jumping to a distant node is done with a magnifying glass; splicing and combining nodes is done with scissors and a glue stick.
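As a rough sketch of how such a hierarchy of collapsible nodes might be represented (the class and method names below are hypothetical, not VROOM’s actual code):

```python
# A hypothetical sketch of a collapsible node hierarchy, in the spirit
# of VROOM's "books": each node can be expanded to reveal its children
# or collapsed to hide them. Names are illustrative, not VROOM's code.

class Node:
    def __init__(self, label: str):
        self.label = label
        self.children: list["Node"] = []
        self.expanded = False

    def add_child(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

    def toggle(self) -> None:
        """Collapse or expand this node."""
        self.expanded = not self.expanded

    def visible_nodes(self) -> list[str]:
        """List the labels a user would currently see under this node."""
        labels = [self.label]
        if self.expanded:
            for child in self.children:
                labels += child.visible_nodes()
        return labels

root = Node("Disorder")                       # parent
lung = root.add_child(Node("Lung disorder"))  # child
lung.add_child(Node("Pneumonia"))             # grandchild

root.toggle()                 # expand the parent; the grandchild stays hidden
print(root.visible_nodes())   # ['Disorder', 'Lung disorder']
```

Collapsing and expanding in this way is what keeps a display of thousands of concepts digestible: only the branches a user opens are drawn.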
Voice commands have also been implemented, eliminating the need for hand controllers and expediting searches for various subjects, e.g., “COVID,” requests to “show lung” and so forth. Vinnikov and her team are presently focusing on incorporating naturalistic gestures as well.
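A voice interface of this kind typically boils down to mapping recognized phrases to actions. A minimal, hypothetical dispatch sketch follows; none of these handlers are VROOM’s actual API:

```python
# A hypothetical sketch of voice-command dispatch: a recognized phrase
# is split into a verb and an argument, then routed to a handler.
# Command names and handlers are illustrative, not VROOM's actual API.

def search(term: str) -> None:
    print(f"Jumping to nodes matching '{term}'...")

def show(organ: str) -> None:
    print(f"Displaying the subtree for '{organ}'...")

COMMANDS = {
    "search": search,  # e.g. "search COVID"
    "show": show,      # e.g. "show lung"
}

def dispatch(utterance: str) -> None:
    """Route a recognized utterance to the matching command handler."""
    verb, _, argument = utterance.partition(" ")
    handler = COMMANDS.get(verb.lower())
    if handler:
        handler(argument)
    else:
        print(f"Unrecognized command: {utterance}")

dispatch("search COVID")  # Jumping to nodes matching 'COVID'...
dispatch("show lung")     # Displaying the subtree for 'lung'...
```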
A future goal is to integrate a multiplayer function, allowing several people to enter the virtual space together; Geller hopes to have it in place by May 2025.
“Right now, VR is a lonely experience. It’s one reason why it is not more popular,” Vinnikov remarked. “We learn from each other in the moment by collaborating as one. This will reduce the need to work through challenging questions and ideas in multiple back-and-forth phone calls.”
Ontologies can be complicated to navigate. VROOM, through Geller and Vinnikov’s vision, makes it “easy” and “fun” to get there.