From Brain Mapping to Drone Swarms, AI Connects Minds and Challenges Makers at NCE
Engineering has been called "the art of the possible," so it's no wonder that engineers of all types are investigating the possibilities offered by artificial intelligence. At New Jersey Institute of Technology's AI Exploration Day, an all-day, campus-wide event held on March 26, presenters from the university's Newark College of Engineering (NCE) offered workshops and demonstrations showing how engineers use AI for research at NJIT, and how AI's computing power helps address longstanding challenges and explore novel solutions across industries.
From the discovery of new materials and medicines to data analysis and automation, AI in engineering appeared in dozens of presentations by NCE faculty and students, revealing applications in healthcare and medicine, manufacturing, construction, agriculture, transportation and education.
Some sessions demonstrated AI's use in human movement analysis, where it informs rehabilitation strategies and helps doctors create personalized medical treatments for injuries. Others discussed how AI guides robots on construction sites to improve productivity and safety; how AI models "learn" the constraints of real-world physics; and how AI can design new battery materials to improve energy efficiency and support sustainable technologies.
For those who wanted to see examples of AI in action, live demonstrations at the "Living Lab" exhibition showcased AI-powered technologies, including drone swarms; a jointed robotic arm controlled remotely by gestures with a handheld controller; and a robotic "dog" whose uncannily doglike movements were controlled through a virtual reality interface. AI helps these technologies "learn" from their human operators and perform autonomously. It enables drone swarms to work collaboratively, and allows robots to explore locations that are risky for humans to enter.
"The bridge between seeing demos like this in events, and things like this entering your household — that gap is very small," said Kasthuri Jayarajah, presenter of the robot dog and an NJIT assistant professor of computer science. "The hardware has constantly become cheaper, and at the same time, the processing methods are also becoming more streamlined."
Presenter Arnob Ghosh discusses cyber-physical interfaces.
Arnob Ghosh, an assistant professor of electrical and computer engineering, hosted a workshop on intelligent cyber-physical interfaces at NJIT, explaining how so-called cyber systems — AI models and other forms of computing — work together with physical components such as machinery, robotics, the environment and human activities. Examples of these integrated systems include power grids, 5G networks, smart buildings, medical devices and industrial robots, to name a few.
In past decades, new technologies enhanced cyber-physical systems by making them more accurate, connected or efficient. "They improved the building blocks," Ghosh says. By comparison, "AI changes the decision layer itself," he explains. "It changes not just what the system can do, but how it decides what to do."
Traditionally, fixed rules dictate how such systems function. But because AI can make decisions in milliseconds and adjust quickly to changing circumstances, it can make cyber-physical systems more adaptable, resilient and autonomous.
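To make that distinction concrete, here is a minimal Python sketch contrasting a fixed-rule controller with a toy adaptive decision layer. The thermostat scenario, thresholds and update rule are invented for illustration and are not taken from Ghosh's workshop.

```python
# Illustrative contrast: fixed-rule control vs. an adaptive decision layer.
# All numbers and the feedback rule are hypothetical.

def rule_based_controller(temp_c: float) -> str:
    """Fixed rule: the behavior is hard-coded and never adapts."""
    return "cool" if temp_c > 24.0 else "idle"

class AdaptiveController:
    """Toy stand-in for an AI decision layer that decides from context
    and adapts online; a real system would use a trained model."""

    def __init__(self, setpoint: float = 24.0):
        self.setpoint = setpoint

    def act(self, temp_c: float, occupancy: int) -> str:
        # The decision depends on context, not just one fixed threshold.
        target = self.setpoint - 0.5 * min(occupancy, 4)
        return "cool" if temp_c > target else "idle"

    def update(self, comfort_feedback: float) -> None:
        # Nudge the policy using observed outcomes (positive = too warm).
        self.setpoint -= 0.1 * comfort_feedback
```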
"AI can reduce energy use, improve scheduling, detect faults early, minimize waste, and optimize resource allocation," Ghosh says. "In domains like buildings, transportation, and energy networks, the practical impact could be huge."
AI has also greatly advanced neuroscience, says Xin Di, an NJIT research professor of biomedical engineering who presented a session on AI-assisted brain mapping, imaging and connectivity research.
"For a long time, progress in brain imaging was mostly about building a better camera," Di says. "But even with the best pictures, the data is incredibly 'noisy.' AI changes the game because it isn’t just a sharper lens; it’s more like a translator. It can sift through the static to find the hidden patterns that connect brain activity to how we actually act and feel."
Xin Di uses AI to detect subtle electrical signals in the brain, which are often undetectable by other methods.
Co-presenters at the neuroscience session integrate AI with technologies such as functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS). Their tools include NeuroSTORM, a large-scale foundation model for fMRI data analysis, and BrainATCL, an AI framework for analyzing certain types of connectivity patterns in the brain.
In his own research, Di uses AI to detect subtle electrical activity that traditional methods often miss. "In one project, we use AI as a 'digital film critic.' While participants watch movies in a scanner, the AI labels the scenes in real-time, allowing us to see exactly how the brain reacts to complex, real-world stories."
Another way that brain researchers use AI is to guide robotic limbs that deliver magnetic brain stimulation, says Elisa Kallioniemi, an assistant professor of biomedical engineering and co-organizer of the presentation.
"The AI learns each person's brain in real time and finds the optimal target," Kallioniemi says. "It is faster, more precise, and more reproducible than a human. The drawback is the significant time required to build and rigorously validate these systems."
Indeed, there are many reasons for exercising caution around AI. In cyber-physical systems, mistakes in an AI model can have immediate real-world consequences, leading to injuries, equipment damage or infrastructure failure. "When a system controls physical processes, engineers need to know why it acted and whether it will remain safe," Ghosh says. "Many AI models are still difficult to verify, certify, or debug rigorously."
Another area of concern is that AI is a "shortcut artist," says Di. "It is so powerful at finding patterns that it sometimes finds the wrong ones — like mistaking background noise from a scanner for an actual brain signal. It still needs a human 'guide' to ensure it’s looking at the science and not just the static."
In neuroscience, Kallioniemi adds, "Our goal must be human-AI collaboration, not blind AI automation."
NJIT seniors Marcus Jerome, Nicholas Colaco and Ryan Adhikari presented "Drone Swarm for Search and Rescue Operations."
In addition to the workshops and panels, more than 100 NJIT students presented AI-powered research projects at a student showcase in the university's Wellness and Events Center. Their work highlighted diverse uses for AI, including gesture translation for sign language, the study of solar activity, drone deployment, disease detection and the treatment of medical conditions such as tinnitus and glaucoma.
Karen Iskander, a senior biomedical engineering major and Albert Dorman Honors College student, demonstrated a prototype for "A(EYE) Assistive Glasses," a wearable navigation system for people with visual impairments. An AI object-detection algorithm called YOLO detects nearby objects, analyzes them to identify potential obstacles and communicates warnings to the wearer.
"Over 2.2 billion people are visually impaired, one billion of which don't have access to assistive technology," Iskander says. "We aim to bridge that gap."
Seniors Ryan Adhikari, Marcus Jerome and Nicholas Colaco (Adhikari and Jerome are electrical engineering majors, and Colaco is majoring in computer engineering) presented "Drone Swarm for Search and Rescue Operations," in which autonomous drones look for people in need of assistance in disaster zones, where conditions are often dangerous for human aid workers. One drone acts as the leader while the others inspect designated areas, expanding the amount of ground they can cover. If one of the drones fails, the others can still continue their work.
"Whatever our drones see is sent back to the main drone, which then processes through our large language model and then sends us data if it detects a person or not," Adhikari explains.
AI is even helping to unravel the mysteries of the human brain, says Michael Glassen '28, a Ph.D. candidate in biomedical engineering. Glassen uses AI to analyze brain scans from people who experienced strokes, looking for patterns that might predict the scope of their recovery. In an analysis of 500 paired brain regions, AI identified 16 as "strongly predictive" of stroke recovery, a finding that could have important clinical uses, Glassen says. For patients who recently had a stroke, AI analysis of their brain scans could tell doctors whether they might benefit from more intensive therapies during the first four months of healing, a critical window for stroke recovery.
"AI is really good for finding things that doctors wouldn't be able to see with regular tests, or just by looking at data," Glassen adds. "It's really able to tease out the needle from the haystack."