Chicago this week began deploying sensors on light poles to monitor, photograph and listen to the city. The effort is costing as much as $7 million, and may be the largest urban data collection of its kind once all 500 nodes are in place.
The beehive-shaped nodes house an array of sensors with enough onboard computing power to process data locally, minimizing the bandwidth needed to transmit results.
Cameras will track the movement of pedestrians, vehicles and whether water is pooling on the street. Another camera will be pointed to the sky. A microphone will monitor noise levels. There will also be temperature, pressure, light and vibration sensors. Particle sensors will detect pollen. Gas sensors will check air quality, recording carbon monoxide, nitrogen dioxide, sulfur dioxide, and ozone. Even the magnetic field will be monitored.
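The sensor mix described above amounts to a compact environmental record per node. As a rough sketch only, one reading might be modeled like this; the field names, units, and values are illustrative assumptions, not the project's actual data schema.

```python
from dataclasses import dataclass

# Hypothetical schema for a single node reading, loosely mirroring the
# sensor categories reported for the project. Names and units are
# illustrative, not the Array of Things' real data model.
@dataclass
class NodeReading:
    node_id: str
    timestamp: float          # Unix epoch seconds
    temperature_c: float      # ambient temperature
    pressure_hpa: float       # barometric pressure
    light_lux: float          # ambient light
    vibration_g: float        # vibration magnitude
    sound_db: float           # noise level from the microphone
    magnetic_field_ut: float  # magnetic field, in microtesla
    co_ppm: float             # carbon monoxide
    no2_ppb: float            # nitrogen dioxide
    so2_ppb: float            # sulfur dioxide
    o3_ppb: float             # ozone
    particle_count: int       # particulates such as pollen

reading = NodeReading(
    node_id="node-001", timestamp=1472688000.0,
    temperature_c=24.5, pressure_hpa=1013.2, light_lux=5400.0,
    vibration_g=0.02, sound_db=68.0, magnetic_field_ut=48.9,
    co_ppm=0.6, no2_ppb=21.0, so2_ppb=3.0, o3_ppb=30.0,
    particle_count=120,
)
print(reading.node_id, reading.sound_db)
```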
Chicago is taking an Internet of Things (IoT) deployment to what may be a new level as it seeks insights into the city’s environment. Beyond the sensors, each node offers computing capability: Odroid Linux systems, a separate controller that can rebuild and update the operating system, and a cellular modem.
Once a unit is placed 20 feet up on a pole, the city wants to do as much as possible remotely.
The data will be publicly available through the OpenGrid.io portal once enough sensors are deployed later this fall. But Chicago CIO Brenna Berman says the city doesn’t yet know how businesses, community groups and others will creatively use the data. “We can’t even begin to imagine what they are going to do with it,” she said.
But Berman has some clear ideas about how the city could use this data. It has an analytical team of 17 people comprising data scientists, business intelligence experts and database administrators, all of whom can use it for predictive analytics.
For instance, Chicago has relied on spot surveys to measure traffic and pedestrian flows. But the camera data, which may snap up to two photos per second, will enable the city to continuously track movement at intersections and analyze how to improve them.
The city has ample data on traffic accidents, said Berman, “but what we don’t have is information about accidents that almost happen or near misses.” That information will help improve safety, she said.
The project has been dubbed the “Array of Things,” an homage to the array of instruments that are combined in telescopes, said Charlie Catlett, a senior computer scientist at Argonne National Laboratory and the project’s principal investigator.
The National Science Foundation provided $3.1 million, and about $2 million was spent on research and development to create the base platform. Cost sharing involving the city and industry partners brings the entire project investment into the $6-to-$7 million range, said Catlett. The entire installation will be completed in 2018.
On-device processing means that data can be extracted from each photo and transmitted while the photo itself is deleted. Monthly transmission per device is expected to be about one gigabyte over a cellular network, said Catlett.
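That pipeline, in essence, is: derive a small numeric summary from each frame, send only the summary, and discard the raw image. A minimal sketch follows; the detector is a stand-in placeholder (the real nodes run actual computer-vision code), and all names here are hypothetical.

```python
import json

def count_objects(image_bytes: bytes) -> int:
    # Placeholder detector: a real node would run a vision model here.
    return 0 if not image_bytes else image_bytes[0] % 16

def process_frame(image_bytes: bytes) -> str:
    # Reduce the frame to a few numbers before anything leaves the node.
    summary = {
        "pedestrian_count": count_objects(image_bytes),
        "image_size_bytes": len(image_bytes),
    }
    payload = json.dumps(summary)
    del image_bytes  # the raw photo is dropped; only the summary is sent
    return payload

payload = process_frame(b"\x07fake-jpeg-data")
print(payload)  # a few dozen bytes instead of a full image
```

Transmitting summaries rather than images is what keeps each node within roughly a gigabyte of cellular traffic per month, and it is also the privacy mechanism discussed below: the photos themselves never routinely leave the device.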
The on-board processing also protects privacy, said Catlett, which was one of the concerns the project sought to address. Although all of the data will go to a cloud-based server, a sampling of photos used for baseline analysis will be sent to a University of Chicago server. The university was awarded the NSF grant.
The device will shut down in extreme weather, although the heat generated by a four-core Arm processor and a Samsung processor used in cell phones will provide some protection in extreme cold, said Catlett.
Although cities are deploying sensors in urban environments, Catlett said he is unaware of anything as extensive as what’s going on in Chicago.
“I’ve yet to talk to anyone from any city who feels that they have adequate information about even the simple things,” said Catlett.