Suhib AlHendi - ECE PhD Student of the Month - April 2026
Suhib AlHendi is a PhD researcher in high-performance computing at NJIT, under the supervision of Prof. Qing Liu. His research sits at the intersection of HPC and distributed systems, with a focus on data movement, system efficiency, and quality of service. He studies how large-scale systems read, write, and move data. His work includes data reduction and reconstruction methods that improve efficiency while preserving the data quality the user needs. He also studies system-level data engineering, with an emphasis on reducing time to usable data, improving resource management, and raising overall system responsiveness. Another part of Suhib’s research focuses on network-aware optimization. This includes adding support inside network infrastructure, such as programmable switches, so that the system adapts data fidelity based on congestion and other runtime conditions.
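The idea of adapting data fidelity to network conditions can be illustrated with a toy policy. The sketch below is not from the interview and is not Suhib's actual method; it is a hypothetical Python example in which a function maps an observed congestion level to a lossy-compression error bound, so that heavier congestion trades accuracy for smaller payloads. The function name and all thresholds are assumptions chosen for illustration.

```python
# Hypothetical sketch: choose a relative error bound for lossy data
# reduction based on observed congestion (e.g., switch queue occupancy).
# Names and thresholds are illustrative, not taken from the research.

def pick_error_bound(queue_occupancy: float) -> float:
    """Map queue occupancy in [0.0, 1.0] to a relative error bound.

    Higher congestion -> looser bound -> smaller payloads on the wire.
    """
    if not 0.0 <= queue_occupancy <= 1.0:
        raise ValueError("queue_occupancy must be in [0, 1]")
    if queue_occupancy < 0.3:   # light load: keep near-full fidelity
        return 1e-6
    if queue_occupancy < 0.7:   # moderate load: trade some accuracy
        return 1e-4
    return 1e-2                 # heavy congestion: aggressive reduction

if __name__ == "__main__":
    for load in (0.1, 0.5, 0.9):
        print(f"occupancy={load:.1f} -> error bound {pick_error_bound(load)}")
```

In a real system this decision would sit closer to the data path (for example, in a programmable switch pipeline) and the bound would be derived from user-specified quality requirements rather than fixed constants.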
Suhib’s broader goal is to optimize end-to-end data paths in distributed environments, so users get faster and more efficient access to data under both normal and constrained conditions. He is also interested in AI in HPC, especially in how AI-driven workloads shape system design and what impact they bring to data movement, resource efficiency, and scalable infrastructure.
What would you say could be the next big thing in your area of research?
I think the next major step in my area is data-centric HPC infrastructure for AI-scale workloads. Data volume keeps growing, and progress no longer depends on faster processors alone. The system also needs to move, transform, store, and serve data with much higher efficiency. You can already see this shift in industry. NVIDIA now treats AI infrastructure as a full-stack problem that includes compute, networking, storage, and communication across systems. At the same time, recent work in large-scale AI shows that memory bandwidth, communication overhead, and data movement often limit performance as much as raw compute.
From an HPC point of view, this puts data engineering at the center of system design. The next advance will come from systems that use resources more efficiently, move data with lower overhead, and adapt better under load. For me, this means building scalable infrastructures where storage, memory, and the network work together to deliver data with the right speed and fidelity for the workload.

Your research group studies high-performance computing. With the ongoing AI boom, people say that computing power is the new gold. What is your opinion on the matter?
Computing power has become a strategic resource in the AI era. Access to large-scale computing shapes who leads in AI, scientific computing, and other data-intensive fields. The comparison to gold makes sense to a point. Strong computing infrastructure gives organizations and countries a real advantage in research, engineering, and economic growth.
Still, HPC shows why compute is not a simple commodity. Performance depends on architecture, interconnects, software, storage, and the workload itself. One unit of compute does not deliver the same value in every system. I do not expect computing to become a fully standardized commodity like oil. Access to computing is already tradable through cloud platforms, but its real value depends on how well the full system is designed and optimized.
There is a new concept called AI-native thinking. Please share your experience of using AI to make your life and/or study more efficient.
My research does not focus on AI itself, but AI has become an important part of how I work. It helps me speed up literature review, information gathering, documentation, coding, and early-stage prototyping. I also use AI when I test ideas, review code, and debug problems. This cuts down the overhead in the early phases of research and lets me move through iterations faster. As a result, I spend more time on the core problem, the system design, and the evaluation of results.
For me, AI-native thinking means treating AI as part of the workflow rather than as a separate tool. It helps with exploration, drafting, and feedback. The researcher still needs to verify correctness, judge quality, and make the final decisions. People who learn how to work with AI in a disciplined way will have an advantage in research and engineering. In my own work, AI helps me move faster and think through more options while keeping my attention on the problem that matters most.