This is the fourth in an article series based on Stephen Goldsmith’s paper “Digital Transformations: Wiring the Responsive City.” Click here to read the report in full.

When I started writing about city services 15 years ago, infrastructure innovations included better pavement processes, design-build approaches, and improved labor practices and productivity. All these continue, but they have partly been supplanted by a new approach to infrastructure: asset optimization.

Today’s innovative city leaders must still maintain and build infrastructure, but tasks such as moving people and traffic and processing water now involve analyzing massive amounts of information to derive the best use of critical public assets. In this chapter, I examine digital breakthroughs in repairing equipment before it breaks, moving people more effectively with analytics, and using smart grids to conserve energy.

EXTENDING ASSET LIFE THROUGH DATA

Data pours out from all over a city—from residents, smartphones, repair orders, and assets themselves. Government technicians see data from flow meters embedded in sewer and water pipes, cameras that monitor bridges, and sensors embedded in streets. Administrators now face a new challenge: how to integrate the data to expand the life and capacity of municipal infrastructure. How does one interpret information to know when, say, a bridge will ice up or begin to develop expensive stress fractures, or when a roadbed will deteriorate so badly that it needs total resurfacing?

In 2007, the I-35W bridge in downtown Minneapolis collapsed during rush hour, dropping 111 vehicles into the Mississippi River. Thirteen people were killed, and 145 injured. When Minnesota officials built a replacement bridge, they ensured its ongoing safety by embedding sensors throughout the structure. During the bridge’s construction, concrete maturity meters and thermal sensors allowed the contractor to produce higher-quality concrete. The bridge’s 323 sensors and gauges now track structural health over time and enable comprehensive, ongoing safety management by the Minnesota Department of Transportation, the Federal Highway Administration, and the University of Minnesota. In addition, temperature, humidity, and wind sensors along the bridge trigger the preemptive spraying of deicing chemicals. The bridge has additional sensors to monitor traffic.
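To make the triggering logic concrete, here is a minimal sketch of how weather sensors might drive preemptive deicing. The thresholds and decision rule are illustrative assumptions, not the bridge’s actual control logic.

```python
# Illustrative sketch: triggering preemptive deicing from weather sensors.
# All threshold values are hypothetical assumptions for illustration.

def should_deice(temp_c: float, humidity_pct: float, wind_kmh: float) -> bool:
    """Return True when conditions suggest ice is likely to form on the deck."""
    near_freezing = temp_c <= 1.0        # assumed trigger temperature
    moist_air = humidity_pct >= 85.0     # assumed humidity threshold
    wind_chill_risk = wind_kmh >= 25.0   # wind accelerates surface cooling
    return near_freezing and (moist_air or wind_chill_risk)

# Example reading from the bridge's weather sensors:
if should_deice(temp_c=0.5, humidity_pct=90.0, wind_kmh=10.0):
    print("Activate deicing sprayers")
```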

SMART WATER

Faced with enormously expensive federal mandates and continuing local pressures that force them to raise water rates almost annually, officials across the country have increasingly turned to sophisticated approaches to maintaining and enhancing their systems. Often in partnership with leading private providers, these officials use technology to make more precise decisions about how to maintain systems, prevent problems before they occur, and expand capacity.

For example, the very well-run Milwaukee Metropolitan Sewerage District (MMSD) prioritizes maintenance work and reduces the risk to critical assets through data collection and a ranking system that considers each asset’s criticality. The district uses extensive remote-sensor information, including sewer levels, gate positions, water quality, weather (rain and wind), and pump-operation status, to better maintain valuable assets. MMSD can now anticipate serious problems, and by integrating the sensor information with a sophisticated maintenance-scheduling system, officials adjust maintenance to better protect MMSD’s collection and treatment assets. Mining and evaluating data from multiple points moves the department from reacting to problems (after they become serious enough to manifest visibly) to targeting resources where they do the most good, as sketched below.
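A minimal sketch of such a criticality-weighted ranking follows. The scoring formula (likelihood times consequence) and the sample assets are illustrative assumptions rather than MMSD’s actual model.

```python
# Illustrative sketch: ranking assets for maintenance by combining an estimated
# probability of failure (derived from sensor trends) with the asset's
# criticality. The scoring model and sample values are hypothetical.

from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    failure_likelihood: float  # 0-1, e.g. from sensor anomaly trends
    criticality: float         # 1-10, impact of failure on the system

def risk_score(asset: Asset) -> float:
    """Classic risk formulation: likelihood x consequence."""
    return asset.failure_likelihood * asset.criticality

assets = [
    Asset("Pump station 12", failure_likelihood=0.30, criticality=9.0),
    Asset("Gate valve 4",    failure_likelihood=0.70, criticality=3.0),
    Asset("Interceptor A",   failure_likelihood=0.10, criticality=10.0),
]

# Schedule maintenance on the highest-risk assets first.
for asset in sorted(assets, key=risk_score, reverse=True):
    print(f"{asset.name}: risk={risk_score(asset):.2f}")
```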

Smaller communities can also avail themselves of such an approach. Gresham, Oregon, used data to recognize that a blower pressurizing gas in a methane cogeneration plant posed the plant’s greatest maintenance risk, which allowed water officials to develop an optimal maintenance plan for that asset. The more that data can be organized and mined, including bringing together information from different programs, the better the results. In Vancouver, Washington, each work order that the utility generates now has accurate and reliable asset-failure information associated with it. Establishing the framework for connecting these data points enables utility staff to identify assets on the brink of failure quickly; the thousands of work orders created during the year thus form an information-based case for additional investment and asset protection.

Unsafe or overweight vehicles impose a cost on fellow motorists, emergency responders, and taxpayers. Enforcement traditionally relies on two imprecise methods: visual checks by safety officers driving the roadways, and inspections at weigh stations (the latter imposes costs on compliant operators, too). New Mexico’s Smart Roadside Inspection System identifies high-risk trucks, without impeding the flow of commerce, by integrating roadside imaging systems with multiple data networks. Cameras and character-recognition software capture each vehicle’s license plate and DOT number, and a thermal detector images the vehicle to spot unsafe equipment. Officials then compare the information with an array of national and state criminal-justice, tax, registration, and motor-vehicle databases to screen for noncompliant operators. In 2011, more than 3.5 million alerts were generated: 54 percent tax-related, 44 percent safety, 1.5 percent overweight, and 0.5 percent crime-related. By automatically identifying high-risk trucks from the roadside, New Mexico’s approach allows inspectors to focus their resources on the trucks that pose the greatest risk to transportation safety and security.
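The screening step can be pictured as a set of fast lookups against compliance lists. In this minimal sketch, the lookup tables and category names stand in for the real state and national databases.

```python
# Illustrative sketch: screening a DOT number captured by roadside OCR
# against several compliance databases. These sets are hypothetical
# stand-ins for the real state and national systems.

tax_delinquent = {"USDOT1234"}
safety_flagged = {"USDOT1234", "USDOT9876"}
overweight_permits_expired = set()
criminal_watchlist = set()

def screen_vehicle(dot_number: str) -> list[str]:
    """Return the alert categories triggered by this carrier, if any."""
    alerts = []
    if dot_number in tax_delinquent:
        alerts.append("tax")
    if dot_number in safety_flagged:
        alerts.append("safety")
    if dot_number in overweight_permits_expired:
        alerts.append("overweight")
    if dot_number in criminal_watchlist:
        alerts.append("crime")
    return alerts

# A hit routes the truck to inspection; a clean screen lets it pass untouched.
print(screen_vehicle("USDOT1234"))  # ['tax', 'safety']
print(screen_vehicle("USDOT5555"))  # []
```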

USING DATA TO INCREASE CAPACITY

Few items cost cities and states more than highway pavement. Today’s breakthrough challenge involves using data to drive more efficient use of public goods. For example, New York City collects increasing amounts of traffic data and incorporates it into traffic management. The degree of change that is possible became clear to me in a discussion with former New York transportation commissioner Janette Sadik-Khan, as she showed me her traffic-control room. Inside this room, filled with an impressive array of video and digital monitors, city engineers made dynamic, real-time adjustments to traffic lights, thereby improving traffic flow. Midtown in Motion, an innovation of NYC DOT, monitors systemwide traffic patterns to set stoplights and move traffic. When the program began in 2011, city officials used more than 100 sensors at 23 intersections to find “congestion choke points,” helping reduce delays. According to NYC DOT, the program reduced travel times in the areas studied by 10 percent or more, and the city plans to expand it as a result.

Each year, more than 200 million vehicles clog the New Jersey Turnpike. But the Turnpike Authority has begun to build out the architecture for predicting traffic jams before they occur. Contractors are installing “puck” sensors in the pavement at one- or two-mile intervals on the turnpike and every four miles on the Garden State Parkway. When complete, the sensors will detect traffic volume, lane occupancy, and speed in real time. Algorithms applied to this data will predict congestion, allowing traffic managers to route cars to the turnpike’s inner or outer roadway before jams form. This information will also allow transportation officials to avoid secondary accidents by signaling slowdowns to truck drivers. Barry Pelletteri, CIO of the New Jersey Turnpike Authority, emphasized that to successfully manage a high-speed roadway, “you need the detection sensors, you need the strong network, you need the algorithm, and you need the software.” On the New Jersey Turnpike, you have “too many vehicles, too much speed, and too much roadway. We can’t keep up with it all unless we can anticipate, and the only way to do that is with the data.”
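A minimal sketch of the prediction step might watch each segment’s recent speed and occupancy readings and raise a flag before traffic actually stops. The window size, thresholds, and trend rule here are illustrative assumptions, not the Turnpike Authority’s algorithm.

```python
# Illustrative sketch: flagging likely congestion from in-pavement sensor
# readings before traffic actually stops. Thresholds are hypothetical.

from collections import deque

class SegmentMonitor:
    """Tracks recent readings for one roadway segment."""

    def __init__(self, window: int = 5):
        self.speeds = deque(maxlen=window)       # mph
        self.occupancies = deque(maxlen=window)  # fraction of time sensor is covered

    def add_reading(self, speed_mph: float, occupancy: float) -> None:
        self.speeds.append(speed_mph)
        self.occupancies.append(occupancy)

    def congestion_likely(self) -> bool:
        """Predict congestion when speeds are falling and occupancy is rising."""
        if len(self.speeds) < self.speeds.maxlen:
            return False  # not enough history yet
        speed_falling = self.speeds[-1] < self.speeds[0] - 10
        occupancy_high = sum(self.occupancies) / len(self.occupancies) > 0.25
        return speed_falling and occupancy_high

monitor = SegmentMonitor()
for speed, occ in [(65, 0.12), (62, 0.20), (58, 0.26), (52, 0.32), (48, 0.40)]:
    monitor.add_reading(speed, occ)
if monitor.congestion_likely():
    print("Divert traffic to the outer roadway; warn trucks upstream")
```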

ADJUSTING CONSUMPTION BEHAVIORS THROUGH PRICING

Harvesting data to better maintain infrastructure, or to help officials manage flow as suggested above, drives great value. Another approach involves organizing data and connecting the information to residents to change user behaviors.

Before dynamic pricing began in Virginia in November 2012, the state’s average commute time was 26.5 minutes, the seventh-highest in the country, and traffic congestion cost residents the equivalent of one full week a year. The state concluded that about 10 percent of all morning rush-hour vehicles were occupied by people on nonessential trips, such as shopping or personal errands; in the afternoon, that figure rose to almost 30 percent. As a result, Virginia began using a variety of demand-based tactics, including tolls on the Downtown and Midtown Tunnels that vary by time of day, as well as High-Occupancy/Toll (HOT) lanes on I-495, outside Washington, D.C., that can be used free by high-occupancy vehicles or by others who pay a variable toll. Pricing the use of valuable and limited resources such as faster lanes can, in effect, increase the supply of critical resources such as roadways. The system piggybacks on existing E-ZPass transponders and toll-collection technology, making implementation easier and reducing the costs of setting up monitoring and billing systems.
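A minimal sketch of demand-based toll setting appears below; the rate schedule and traffic breakpoints are invented for illustration, whereas real HOT-lane systems set prices continuously from measured traffic.

```python
# Illustrative sketch: a demand-based toll that rises with measured demand
# to keep the priced lane free-flowing, with high-occupancy vehicles riding
# free as on the I-495 lanes. The rate schedule is hypothetical.

def toll(vehicles_per_hour: int, occupants: int) -> float:
    """Return the toll in dollars for one trip."""
    if occupants >= 3:
        return 0.0   # high-occupancy vehicles travel free
    if vehicles_per_hour < 1000:
        return 1.50
    if vehicles_per_hour < 1500:
        return 4.00
    return 8.00      # peak demand: price discourages nonessential trips

print(toll(vehicles_per_hour=1600, occupants=1))  # 8.0
print(toll(vehicles_per_hour=1600, occupants=3))  # 0.0
```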

These initiatives present difficult political obstacles since, by definition, they create winners and apparent losers. The obstacles grow when tolls rise to curb congestion but the beneficiaries of the new revenue are another class of commuters. For example, when former New York mayor Michael Bloomberg undertook his congestion-pricing initiative, his goal was not only to ease congestion but also to raise money for mass transit. To a degree, this decision created diverging interests between those paying the tolls and the Metropolitan Transportation Authority riders who would reap the financial benefits. And even MTA bus and subway riders remained unconvinced that the extra money would enhance services. Thus, a sound proposal to use data to shape consumption ran into a buzz saw of more parochial interests.

Good data needs to be presented inside a framework that produces allies. For example, Virginia’s dynamic-pricing program is producing less revenue than anticipated because motorists are driving less, or at different times, instead of paying to avoid congested roadways during high-demand hours. Judged from a congestion-reduction rather than a revenue perspective, of course, this result could be characterized as a success. Similarly, London’s decision to include environmental mitigation as a goal of its congestion-pricing efforts may not reduce emissions as efficiently as a more granular system that varies charges based on each car’s true environmental impact; London’s current scheme instead simply exempts ultralow-emissions vehicles, plug-in hybrids, and electric cars. Combining multiple objectives makes it much harder to use the data to prove or disprove value. London’s congestion-charge revenues, while limited, have been leveraged by Transport for London, the quasi-governmental entity that controls the roads, to issue bonds for road and transit improvement planned to total £3.1 billion. If this bond issue is successful, it may be one of the more significant positive impacts of the congestion charge, providing much-needed cash to develop the city while tax revenues and government expenditures continue to shrink.

Milan’s Area C congestion charge is levied primarily as an environmental initiative, with four levels of payments based on a car’s emissions. Over time, residents purchased more lower-emissions vehicles to qualify for free entrance to downtown, which again increased congestion, if not pollution. The Area C program has since shifted its emphasis to congestion control, doing away with the scaled payments for all but hybrid and electric vehicles. That these cities continue to modify charges to induce certain outcomes can itself be counted a success, because it demonstrates how pricing and data can be used to achieve public goals.

CONSERVING RESOURCES THROUGH BETTER DATA

Across the United States, water and electric utilities are replacing older meters that require monthly visits from meter readers with online meters that produce a continuing stream of important data. Dubuque, Iowa’s “smart city” initiative, Smarter Sustainable Dubuque, began in 2009 as a partnership with IBM. The program expanded from Smarter Water to include Smarter Electricity, Smarter Travel, Smarter Discards, and Smarter Health and Wellness. The water program equipped 300 volunteer households with smart water meters and access to an online dashboard, coupled with leak-detection monitoring, community education, and incentive programs. In the first year, participating households used 6.6 percent less water and detected eight times as many leaks. The next effort, Smarter Electricity, involved 1,200 households and used similar incentive programs to promote a reduction in energy use; the pilot produced 4 percent less monthly energy use.
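One common leak-detection heuristic that interval metering makes possible: if a household’s minimum overnight flow never reaches zero, something is probably running. The readings and threshold in this sketch are illustrative, not Dubuque’s actual method.

```python
# Illustrative sketch: flagging a probable leak from hourly smart-meter
# readings. In a tight house, flow should hit zero at some point overnight;
# a constant nonzero floor suggests a running toilet or dripping pipe.
# Sample data and threshold are hypothetical.

def probable_leak(overnight_gph: list[float], threshold: float = 0.1) -> bool:
    """overnight_gph: gallons per hour, e.g. readings from 1am-5am."""
    return min(overnight_gph) > threshold

household_a = [0.0, 0.0, 0.2, 0.0]   # flow hits zero: no leak indicated
household_b = [0.4, 0.5, 0.4, 0.6]   # constant flow floor: probable leak

print(probable_leak(household_a))  # False
print(probable_leak(household_b))  # True
```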

A different form of integrating data and consumption occurs when consumers grant a provider the right to monitor their data and make decisions on their behalf. Some 230,000 Duke Energy households participate in a program that allows the utility to turn off their air conditioners remotely, in exchange for $25–$35 annually. Duke reduces power only incrementally, and for short periods, to save energy without causing so much disruption that participants would opt out of the program.

More information allows residents to make better decisions. One key opportunity for public officials to produce value occurs when they find new ways to generate consumer-usable digital information. In 2010 and 2011, the San Francisco Municipal Transportation Agency (SFMTA), with federal and private partners, installed parking sensors in more than 8,000 on-street parking spaces. The innovative pilot program, called SFpark, aims to make parking easier, public transit faster, biking and walking safer, and commercial areas more vibrant. Combining sensors, meters, and a demand-responsive pricing model, the pilot lets drivers know, through a web and smartphone app, where they can find open parking spaces at meters and in garages in seven neighborhoods. The SFMTA uses occupancy data from the sensors to adjust parking prices, with the goals of keeping at least one parking space open on each block and ensuring that garages never fill. For example, on August 11, 2013, rates decreased by $0.25 or $0.50, depending on the location and occupancy rate of the parking space, during 18 percent of metered hours; they were unchanged for 62 percent of metered hours and increased by $0.25 during 20 percent of metered hours. Parking rates may vary by block, time of day, or day of the week, and rates increase by either $5 or $7 per hour during special events around the baseball stadium. Although the evaluation of the pilot had not been released by press time, extensive online data is available on the meter-rate adjustments, including the pilot area, street, block, time of day, previous rate, and current rate.
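The adjustment rule can be sketched directly from the figures above: rates move in $0.25 or $0.50 steps according to measured occupancy. The occupancy bands in this sketch are illustrative assumptions, not the SFMTA’s published thresholds.

```python
# Illustrative sketch: adjusting a block's meter rate from measured occupancy,
# in the $0.25/$0.50 steps the SFpark pilot used. The occupancy bands are
# hypothetical stand-ins for the SFMTA's actual policy.

def adjust_rate(current_rate: float, occupancy: float) -> float:
    """Return the new hourly rate for a block given its average occupancy (0-1)."""
    if occupancy >= 0.80:
        change = +0.25   # block nearly full: raise the price
    elif occupancy >= 0.60:
        change = 0.00    # target band: leave the rate alone
    elif occupancy >= 0.30:
        change = -0.25   # underused: lower the price a step
    else:
        change = -0.50   # mostly empty: lower the price two steps
    return max(0.25, current_rate + change)  # keep some floor rate

print(adjust_rate(current_rate=2.00, occupancy=0.85))  # 2.25
print(adjust_rate(current_rate=2.00, occupancy=0.20))  # 1.5
```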

SFMTA reports that an unprecedented data set is being established, combining parking data (collected through sensors, meters, and citations) with garage, municipal (travel-time and transit-vehicle), parking-tax, and survey data. According to Sonali Bose, the SFMTA’s chief financial officer, “After evaluating the SFpark pilot project, the SFMTA will use lessons learned to develop a proposal for expanding the SFpark approach to the rest of San Francisco’s metered parking and city-owned parking garages. We expect that expanding demand-responsive pricing for parking will make it easier for drivers to find parking and improve the quality of life without any loss of parking revenues.” With all the policies and data used to make rate changes available online, and the source code for SFpark’s apps and map released in July 2013, this effort is truly transparent and has transformative potential.

LEARNING FROM DATA GENERATED BY RESIDENTS

Cities can also learn from another set of sensors: the customers of their transit and street systems. The Oyster card is a plastic smartcard used to pay fares on most forms of transportation in London. Customers using Oyster on the London Underground tap their cards when entering and exiting the system to pay their fares and open the gates. More than 80 percent of public transport trips on the London network use Oyster, providing the system’s public operator, Transport for London (TfL), with millions of data points every day that can be used to understand the size and shape of customer demand on the transport network. Although users may register their Oyster cards to protect them against loss or theft, TfL encrypts card numbers as part of a process to make the travel data anonymous before analysis. Lauren Sager Weinstein, head of analytics for TfL’s Customer Experience Team, described Oyster as a powerful tool: “Before Oyster was introduced, TfL was reliant solely upon surveys asking passengers to report how they traveled. Surveys, by nature, have limited sample sizes and have a limit in terms of how frequently they can be undertaken. Oyster is a more cost-effective and powerful tool for understanding TfL’s customers and responding to their needs.”

In addition to providing an efficient way to count the number of customers across the network, Oyster data provides specific journey times for particular origin and destination pairs. “Through Oyster data, TfL can understand the ranges of times that customers take to make particular journeys across the network,” said Sager Weinstein. “This allows TfL to measure reliability of services over time and to measure the impact of the introduction of new services and timetable improvements.” Oyster data also helped TfL develop models to identify stations that would be hot spots for congestion ahead of the 2012 London Olympics. Based on these findings, TfL rolled out a public messaging campaign encouraging customers to avoid certain stations at peak hours and also supplied extra trains to relieve anticipated congestion at targeted stations and lines. Subsequent analysis by TfL indicated that, although demand at stations serving Olympic venues surged by an average of 83.9 percent during the games, when the London Underground carried record-breaking numbers of customers, the transport network was able to accommodate the spike through a combination of increased capacity and targeted messaging about hot spots. Oyster data also showed a decrease in “background” travel demand at stations typically serving commuters, with card entries and exits down 13 percent during the Olympics.
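Two of the techniques described here, anonymizing card numbers before analysis and aggregating journey times by origin-destination pair, can be sketched briefly. One-way hashing stands in for TfL’s actual encryption process, and the trip records are invented.

```python
# Illustrative sketch: anonymizing card IDs before analysis, then summarizing
# journey times per origin-destination pair. SHA-256 hashing is a stand-in
# for TfL's actual encryption process; the trip records are invented.

import hashlib
from collections import defaultdict
from statistics import mean

def anonymize(card_id: str) -> str:
    """One-way hash so analysts never see the real card number."""
    return hashlib.sha256(card_id.encode()).hexdigest()[:12]

# (card_id, origin, destination, journey_minutes)
taps = [
    ("0001", "Stratford", "Bank", 24),
    ("0002", "Stratford", "Bank", 27),
    ("0003", "Oxford Circus", "Victoria", 9),
    ("0001", "Stratford", "Bank", 31),
]

journeys = defaultdict(list)
cards_seen = defaultdict(set)
for card_id, origin, dest, minutes in taps:
    anon = anonymize(card_id)  # analysts only ever see this token
    journeys[(origin, dest)].append(minutes)
    cards_seen[(origin, dest)].add(anon)

for (origin, dest), times in journeys.items():
    n_cards = len(cards_seen[(origin, dest)])
    print(f"{origin} -> {dest}: mean {mean(times):.1f} min, {n_cards} distinct cards")
```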

Boston secures perhaps more infrastructure information from its citizens than any other city. Promoted by a special mayoral office and initiative called the New Urban Mechanics, the city developed a smartphone app called Citizens Connect, which lets residents report infrastructure and other public-works needs (potholes, graffiti, etc.). Geotagged pictures provide more accurate information to city crews, or even to neighbors interested in common problems. Reporting by app provides a digital foundation for analysis. As of December 2012, Citizens Connect had been used to resolve more than 35,000 issues.

For a look at the evolving possibilities, observers can look to Santander, Spain, which outfitted its community with more than 15,000 sensors that measure everything from air quality and noise level to light intensity and traffic, with the ultimate goal of improving city services and quality of life. The SmartSantander initiative has transformed the city into an experimental platform to research applications and services for use in “smart cities” of the future. In 2010, Luis Muñoz, a University of Cantabria IT professor, secured, jointly with Telefónica, a European Commission grant of 8 million euros to fund the initiative. The regional government of Cantabria provided an additional 500,000 euros to pay for half the cost of purchasing the sensors (the other half of purchase and deployment costs was funded by the E.C. grant). Four major fixed-sensor initiatives are now under way: environmental-monitoring, traffic, energy, and irrigation pilots. The energy pilot started with an intelligent lighting program that reduces lamp voltage by 30 percent when no movement is sensed nearby. On a rainy evening, when fewer pedestrians and cyclists use city parks, the intelligent lighting system can reduce electricity use by 30–40 percent. Muñoz predicts that expanding this pilot could reduce city lighting costs by 20–25 percent. The next two pilots, scheduled to start in the coming year, involve water and waste. For the latter, waste bins on city streets will be equipped with sensors to measure fill level; the city will use this data to plan routes for garbage and recycling trucks, reducing trips, traffic, and emissions.
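The presence-based dimming logic can be sketched simply: lamps idle at reduced power and return to full brightness for a holding period after a motion sensor fires. The 30 percent dim level follows the pilot’s description; the hold time is an assumption.

```python
# Illustrative sketch: presence-based street lighting. Lamps idle at reduced
# power and jump to full brightness when motion is detected nearby, then dim
# again after a hold period. The 30% reduction echoes the Santander pilot;
# the hold time is a hypothetical value.

DIMMED_LEVEL = 0.70   # 30 percent reduction when no one is nearby
FULL_LEVEL = 1.00
HOLD_SECONDS = 120    # assumed: stay bright 2 minutes after last detection

def lamp_level(seconds_since_motion: float | None) -> float:
    """Return the lamp's power level given the last motion detection."""
    if seconds_since_motion is not None and seconds_since_motion < HOLD_SECONDS:
        return FULL_LEVEL
    return DIMMED_LEVEL

print(lamp_level(seconds_since_motion=15))    # 1.0  (pedestrian just passed)
print(lamp_level(seconds_since_motion=600))   # 0.7  (park is empty)
print(lamp_level(seconds_since_motion=None))  # 0.7  (no detection yet tonight)
```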

Increasing demands on urban infrastructure require city officials to find new ways both to keep it in good repair and to engage residents in ways that encourage efficient use. City officials possess enormous amounts of data that they can use to make better decisions about how to maintain and extend the life and use of infrastructure. Data, especially information from inside government combined with data harvested from interested residents, creates an opportunity to use scarce resources far more efficiently. The initial successes highlighted here will soon become mainstream, providing commuters and taxpayers with new opportunities while changing the behavior of consumers and public workers alike.


Originally posted at Data-Smart City Solutions.