Model Cars With Human Drivers Take to the Streets of a Miniaturized Newark
In an NJIT robotics lab spilling over with 3D-printed parts, engineering students are gutting toy trucks, SUVs, sedans and two-door Mini Coopers and refitting them with custom-designed systems: laser-cut side mirrors, wheels that can parallel park and a braking system whose algorithms control an electric motor, enabling soft and hard braking, idling and taxiing.
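The article doesn't detail the braking algorithms themselves, but a minimal sketch of how algorithmic braking through PWM motor control might look is below. The class and method names (BrakeController, soft_brake and so on) are illustrative assumptions, not the lab's actual code, and real hardware would write the duty cycle to a motor driver rather than print it.

```python
# Hypothetical sketch of PWM-based soft/hard braking for a small DC motor.
# Names and parameter values are illustrative, not from the NJIT lab's code.
import time

class BrakeController:
    """Ramps a DC motor's PWM duty cycle to emulate soft or hard braking."""

    def __init__(self, duty: float = 1.0):
        self.duty = duty  # current PWM duty cycle, 0.0 (stopped) to 1.0 (full power)

    def set_duty(self, duty: float) -> None:
        # On real hardware this would write to a PWM pin; here we just record it.
        self.duty = max(0.0, min(1.0, duty))
        print(f"duty -> {self.duty:.2f}")

    def soft_brake(self, step: float = 0.1, interval_s: float = 0.05) -> None:
        """Decelerate gradually by stepping the duty cycle down."""
        while self.duty > 0.0:
            self.set_duty(self.duty - step)
            time.sleep(interval_s)

    def hard_brake(self) -> None:
        """Cut motor power immediately for an emergency stop."""
        self.set_duty(0.0)

    def idle(self, duty: float = 0.15) -> None:
        """Hold a low duty cycle so the car can creep, as when taxiing."""
        self.set_duty(duty)

if __name__ == "__main__":
    brake = BrakeController()
    brake.soft_brake()   # gentle stop
    brake.idle()         # creep forward
    brake.hard_brake()   # emergency stop
```

Stepping the duty cycle down gradually is one common way to get smooth deceleration from a small motor; a production controller would more likely close the loop on wheel-speed feedback.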
Now tooling around Cong Wang’s Control Automation Robotics Lab, where they are guided by remote drivers at gaming-style steering wheel and pedal control stations, the cars will soon be deployed on the streets of a miniaturized Newark, N.J., where they will share the right of way with computer-controlled autonomous cars and remotely operated pedestrians. Three tiny video cameras, facing in different directions, sit in the driver position of each remotely steered car, and a LIDAR scanning range finder spins on the roof, continuously surveying the surroundings so drivers can respond to conditions on the road.
The model city, with its signalized intersections, access ramps and mix of roads and highways, is a novel assessment platform designed by Jo Young Lee, an associate professor of civil engineering, to evaluate the impacts of connected and automated vehicles (CAVs) on drivers, passengers in autonomous cars and pedestrians. Through crowdsourced experiments conducted over the internet, test participants will play each of these parts using virtual reality interfaces so that Lee and his collaborators, including Wang, an assistant professor of electrical and computer engineering, and Guiling “Grace” Wang, a professor of computer science, can evaluate their responses.
“Autonomous cars pose challenges since they are programmed to respond to general rules of the road, while drivers and pedestrians don’t strictly follow them,” notes Nishaant Goswamy, a junior majoring in computer engineering who spent last summer in Cong Wang’s lab working on the camera vision system, among other automotive elements.
“Sometimes people change lanes without signaling or don’t come to a complete stop at a stop sign. Pedestrians jaywalk and may expect drivers to stop for them,” he adds. “Using this platform, we will study the drivers’ responses to complicated driving situations, including the number of lane changes they make, the driving distance between the cars, and obedience to traffic laws such as stop, yield and speed limit signs.”
While the goal of CAV technology is to make driving safer and more efficient, there is still little information on how humans respond to these cars, team members say. Without understanding people’s sense of safety and comfort, as well as their physical reactions, such as steering and braking, it will be difficult to deploy the vehicles on the road.
Existing evaluations depend heavily on computer simulations, which researchers say can’t fully capture human reactions. Using crowdsourced cyber-physical reality, which relies on visual and force feedback from human subjects, they aim to create a more realistic test. They will measure behavioral reactions, such as steering maneuvers and acceleration or deceleration, and are developing methods to assess emotional responses such as safety awareness and degree of comfort.
For the students, the challenge is to create sophisticated systems that mimic a real driving experience yet fit into cars as small as 9 inches long. Many of their efforts went through several iterations. Before arriving at the three-camera system, for example, Goswamy says he helped design a prototype in which a camera placed at the focal point of a cone mirror captured a 360-degree circular image of the car’s surroundings. He then used algorithms to unwrap the image into a distortion-free panoramic view showing the front windshield, side views and back windshield.
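Goswamy's unwrapping algorithms aren't published, but catadioptric (mirror-based) images are commonly flattened with a polar-to-rectangular remap. The sketch below shows that general technique using OpenCV; the mirror center, inner and outer radii, and output width are assumed parameters, not the lab's values.

```python
# A minimal sketch of unwrapping a cone-mirror image into a panorama with a
# polar-to-rectangular remap. Parameter values are illustrative assumptions.
import cv2
import numpy as np

def unwrap_panorama(img, cx, cy, r_inner, r_outer, out_w=1440):
    """Map the donut-shaped mirror reflection to a rectangular panorama."""
    out_h = int(r_outer - r_inner)
    # Each output column corresponds to a viewing angle; each row to a radius.
    theta = np.linspace(0, 2 * np.pi, out_w, dtype=np.float32)
    radius = np.linspace(r_outer, r_inner, out_h, dtype=np.float32)
    r_grid, t_grid = np.meshgrid(radius, theta, indexing="ij")
    map_x = (cx + r_grid * np.cos(t_grid)).astype(np.float32)
    map_y = (cy + r_grid * np.sin(t_grid)).astype(np.float32)
    # remap samples the source image at the computed polar coordinates
    return cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_LINEAR)

# Example, assuming a 1000x1000 capture with the mirror centered:
# pano = unwrap_panorama(frame, cx=500, cy=500, r_inner=120, r_outer=480)
```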
“This was a novel achievement, because we had to 3D-print the cone-shaped mirror and develop new algorithms to undistort the image and adjust it to the scale of the car,” he says. Ultimately, however, they decided not to use it. While a single camera allowed them to capture images quickly, the picture remained too distorted. For the three-camera replacement, they had to develop special image-compression methods to speed up the streaming.
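The team's specific compression methods aren't described, so the sketch below shows one conventional approach to keeping a multi-camera stream responsive: per-frame JPEG encoding sent over a length-prefixed socket. The quality setting and the framing scheme are assumptions for illustration, not the lab's protocol.

```python
# Sketch of per-frame JPEG compression and length-prefixed streaming.
# Quality level and wire format are illustrative assumptions.
import socket
import struct
import cv2
import numpy as np

def send_frame(sock: socket.socket, frame, quality: int = 70) -> None:
    """JPEG-compress one camera frame and send it with a 4-byte length header."""
    ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, quality])
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    data = buf.tobytes()
    sock.sendall(struct.pack("!I", len(data)) + data)

def recv_frame(sock: socket.socket):
    """Receive one length-prefixed JPEG frame and decode it to an image."""
    header = sock.recv(4, socket.MSG_WAITALL)
    (length,) = struct.unpack("!I", header)
    payload = sock.recv(length, socket.MSG_WAITALL)
    return cv2.imdecode(np.frombuffer(payload, dtype=np.uint8), cv2.IMREAD_COLOR)
```

Lowering the JPEG quality trades image fidelity for bandwidth, which matters when three cameras share one wireless link.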
“For the LIDAR, we had to create the smallest device possible, and it took us three generations,” says Carlos Maranon, a sophomore majoring in electrical engineering who helped design and program it. He also coded the vehicle’s feedback system so that the data it collected — speed, LIDAR position, distance readings and camera feedback — would appear as a graph on the main display screen viewed by the remote driver.
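Maranon's display code isn't shown in the article; a rough sketch of how one might graph a rolling window of LIDAR readings with matplotlib follows. The window size, units and update function are assumptions.

```python
# Sketch of a rolling telemetry graph for the remote driver's display.
# Field names, units and buffer size are illustrative assumptions.
import collections
import matplotlib.pyplot as plt

history = collections.deque(maxlen=200)  # keep the last 200 samples

plt.ion()  # interactive mode so the plot refreshes as data arrives
fig, ax = plt.subplots()
(line,) = ax.plot([], [])
ax.set_xlabel("sample")
ax.set_ylabel("LIDAR distance (cm)")

def update_display(distance_cm: float) -> None:
    """Append one LIDAR reading and redraw the rolling graph."""
    history.append(distance_cm)
    line.set_data(range(len(history)), list(history))
    ax.relim()            # recompute data limits from the updated line
    ax.autoscale_view()   # rescale axes to fit the new window
    fig.canvas.draw_idle()
    plt.pause(0.001)      # let the GUI event loop process the redraw
```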
Cong Wang’s students are also working on an onboard GPS tracking unit.
“This project is highly interdisciplinary; it draws on skills from several different areas, such as transportation engineering, computing, networking and robotics,” he notes.
“The students are very creative and willing to try out a lot of different ideas. They don’t all work, but even the failures help us move ahead. They also come up with some quite novel ideas that inspire the team and become part of the design.”