Researchers in Texas are developing a new tool that uses artificial intelligence (AI) to recognize objects from raw traffic camera footage, and then characterize how those objects move and interact.
The combined research team from the Texas Advanced Computing Center (TACC) and the Center for Transportation Research at the University of Texas at Austin, together with the City of Austin, is developing tools that enable sophisticated, searchable traffic analyses using deep learning and data mining techniques. Their new deep learning tool uses raw footage from City of Austin traffic cameras to recognize objects, such as people, cars, buses, trucks, bicycles, motorcycles and traffic lights, and then characterizes how those objects interact with each other. Traffic engineers and officials can then analyze and query this information to determine, for instance, how many cars drive the wrong way down a one-way street.
The algorithm they developed for traffic analysis automatically labels all potential objects in the raw data, tracks objects by comparing them with previously recognized objects, and compares the outputs from each frame to uncover relationships among the objects. Once the researchers had developed a system capable of labeling, tracking and analyzing traffic, they applied it to two practical examples: counting how many moving vehicles traveled down a road, and identifying close encounters between vehicles and pedestrians.
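The article does not detail the team's tracking algorithm, but the step of "comparing objects with previously recognized objects" across frames can be illustrated with a common, simplified technique: matching each new detection to an existing track by bounding-box overlap (intersection-over-union). The box format, threshold, and function names below are illustrative assumptions, not the researchers' actual implementation.

```python
# Sketch of frame-to-frame tracking by bounding-box overlap (IoU).
# Boxes are (x1, y1, x2, y2) in pixel coordinates; tracks map an
# integer track ID to that object's most recent box.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def update_tracks(tracks, detections, threshold=0.3):
    """Match each new detection to the best-overlapping existing track;
    detections with no sufficiently overlapping track start a new one."""
    next_id = max(tracks, default=-1) + 1
    for det in detections:
        best_id, best_score = None, threshold
        for tid, box in tracks.items():
            score = iou(box, det)
            if score > best_score:
                best_id, best_score = tid, score
        if best_id is None:
            tracks[next_id] = det      # previously unseen object
            next_id += 1
        else:
            tracks[best_id] = det      # same object, updated position
    return tracks
```

For example, a detection that overlaps heavily with last frame's box keeps its track ID, while a detection appearing elsewhere in the frame is assigned a new one. Production trackers typically add motion models and handle occlusion, which this sketch omits.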
The system automatically counted vehicles in a 10-minute video clip, and preliminary results showed that their tool was 95% accurate overall.
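A vehicle count like the one described can be derived from track data by registering each track once as its centroid crosses a virtual line in the frame. The track format and line position below are assumptions for illustration; the team's actual counting method is not described in the article.

```python
# Sketch of line-crossing vehicle counting. track_histories maps a
# track ID to that vehicle's centroid positions (x, y) over time;
# a vehicle is counted once when its y-coordinate crosses line_y.

def count_crossings(track_histories, line_y=200):
    """Count tracks whose centroid moves from above line_y to at/below it."""
    count = 0
    for points in track_histories.values():
        for (xa, ya), (xb, yb) in zip(points, points[1:]):
            if ya < line_y <= yb:
                count += 1
                break  # count each vehicle at most once
    return count
```

This one-directional check also hints at how a wrong-way query could work: counting crossings in the direction opposite to the legal flow of traffic.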
In the case of potential close encounters, the system automatically identified a number of cases where vehicles and pedestrians came into close proximity. None of these represented real-life dangers, but they demonstrated how the system can discover potentially dangerous locations without human intervention. The researchers plan to explore how automation can facilitate other safety-related analyses, such as identifying locations where pedestrians cross busy streets outside of designated walkways, understanding how drivers react to different types of pedestrian-yield signage, and quantifying how far pedestrians are willing to walk in order to use a walkway. The project shows how AI technologies can greatly reduce the effort involved in analyzing video data and provide actionable information for decision makers.
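One simple way to flag the close encounters described above is to check, frame by frame, whether any pedestrian's position falls within a distance threshold of any vehicle's. The object format, class labels, and pixel threshold here are illustrative assumptions rather than the team's published method.

```python
import math

# Sketch of close-encounter flagging for a single frame. objects is a
# list of (label, (cx, cy)) pairs, where (cx, cy) is the detected
# object's centroid in pixels.

def close_encounters(objects, threshold=50.0):
    """Return (pedestrian, vehicle) centroid pairs closer than threshold."""
    flagged = []
    pedestrians = [c for label, c in objects if label == "person"]
    vehicles = [c for label, c in objects if label == "car"]
    for p in pedestrians:
        for v in vehicles:
            if math.dist(p, v) < threshold:
                flagged.append((p, v))
    return flagged
```

A real system would work in road coordinates rather than pixels and incorporate trajectories and speed, but the same nearest-pair logic underlies the idea of surfacing dangerous locations automatically.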
“We are hoping to develop a flexible and efficient system to aid traffic researchers and decision-makers for dynamic, real-life analysis needs,” said Weijia Xu, a research scientist at TACC. “We don’t want to build a turnkey solution for a single, specific problem. We want to explore means that may be helpful for a number of analytical needs, even those that may pop up in the future.”