With NSF Grant, NJIT Engineers Gaze Into Robotic Eyes, Envision Microchips There
Machine vision sensors could work faster and more efficiently if they were designed like biological eyes, according to Associate Professor Dong-Kyun Ko of New Jersey Institute of Technology’s Electrical and Computer Engineering department.
Ko is principal investigator on a $467,000 National Science Foundation grant titled Infrared Retinomorphic Vision. His co-principal investigator is Assistant Professor Shaahin Angizi, who specializes in non-traditional circuit architecture and AI-aided design.
Modern machine vision systems, whether in your Tesla or a factory robot, generally incorporate optical image sensors. The sensors capture visual information as analog electrical data, which is converted into snapshots and shuttled to a digital back-end for image processing, one frame at a time. This creates bottlenecks, so the new idea from Ko and his colleagues is to make optical sensors that perform both sensing and data processing, rather than bogging down a main processor. Sensors would essentially tell processors what they see, whether it’s a road sign or a factory worker, rather than merely telling the processor that some unspecified object is present and must still be identified.
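The contrast between the two approaches can be pictured with a toy software model. This is purely illustrative, assuming a simple change-detection form of in-sensor preprocessing; the function names, threshold, and frame sizes are invented for the sketch and are not the actual device design:

```python
import numpy as np

rng = np.random.default_rng(0)

def conventional_readout(frame):
    # Conventional pipeline: the sensor ships every pixel of every
    # frame downstream, and a separate processor must work out what,
    # if anything, has changed.
    return frame

def retinomorphic_readout(frame, prev_frame, threshold=0.2):
    # Toy in-sensor preprocessing: report only pixels whose brightness
    # changed by more than a threshold, the way a retina emphasizes
    # change instead of re-sending a static scene.
    diff = np.abs(frame - prev_frame)
    rows, cols = np.nonzero(diff > threshold)
    return list(zip(rows.tolist(), cols.tolist(), frame[rows, cols].tolist()))

# A static 16x16 scene with one small change between frames.
prev = rng.random((16, 16))
curr = prev.copy()
curr[3, 7] += 0.5  # the only pixel that changed

full = conventional_readout(curr)
events = retinomorphic_readout(curr, prev)

print(full.size)    # 256 values shipped every frame
print(len(events))  # 1 event: only the changed pixel is reported
```

In the toy model, the conventional readout transmits all 256 pixel values every frame, while the change-driven readout transmits one, which is the kind of data reduction the article describes.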
“It reduces the amount of visual information that needs to be computed by the brain, making the overall process faster and more energy-efficient,” Ko said, adding that pre-processing is just one task that biological eyes perform, alongside dark adaptation, light adaptation and motion detection.
Currently, “We have simulated the device so that it functions as a retinomorphic sensor. This project builds on this foundation, and now we're going to try to fabricate the device,” Ko stated. “We'll start with a simple thing and grow to an actual system. So you start with one single pixel. And then we're going to try to make a 2-by-2, a 6-by-6 and up to a 16-by-16 array of these devices. When you make a 2-dimensional array, it becomes an imaging chip. But in a more technical term we call this a focal plane array.”
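The scaling path Ko describes, from a single pixel up to a 16-by-16 focal plane array, can be sketched as a small software model. This is a toy illustration only; the class and its behavior are invented for the sketch, while the real devices are analog hardware:

```python
import numpy as np

class FocalPlaneArray:
    """Toy model of a focal plane array: a 2-D grid of identical
    photodetector pixels read out together as one image frame.
    Purely illustrative -- not the actual PbSe device design."""

    def __init__(self, size: int):
        self.size = size  # e.g. 2 for a 2x2 array, 16 for 16x16

    def capture(self, scene: np.ndarray) -> np.ndarray:
        # Each pixel samples the incident intensity at its grid
        # position; reading out the whole grid yields an image frame,
        # which is what makes the 2-D array an imaging chip.
        if scene.shape != (self.size, self.size):
            raise ValueError("scene must match the array dimensions")
        return scene.astype(float)

# The fabrication roadmap in the article: 1 pixel, then 2x2, 6x6, 16x16.
for n in (1, 2, 6, 16):
    fpa = FocalPlaneArray(n)
    frame = fpa.capture(np.ones((n, n)))
    print(f"{n}x{n} array -> frame of {frame.size} pixels")
```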
The new sensors will be based on a material called polycrystalline lead selenide, a semiconductor widely used in low-cost, uncooled systems. They will detect mid-infrared wavelengths, enabling the machine vision system to see through fog, mist and smoke. Such benefits are why mid-infrared sensors are widely used by first responders, the intelligence community, law enforcement and the military, Ko said. Lead selenide also has a property that allows the sensors to operate at room temperature without cryogenic cooling.
Just as nature’s machines survived by being the fittest, “I think that’s the future of AI,” Ko stated. “AI will gather all the information from its surroundings, and process it to make a decision. That's what AI does — it learns and tries to make the best decision as fast as possible, and I believe visual information is the key. And I think that's also true for all the other biological systems, too. About 80% of the information we biological organisms gather from the surrounding environment comes from visual sources and it plays a vital role in our survival.”
“For AI, I foresee that visual information is the bottleneck for all information processing, considering the colossal amount of real-time information that needs to be processed, unlike temperature or pressure numerical values. And that's why I think retinomorphic vision sensors, that serve as an electronic eye for future AI, is important research.”