Smart City Platform Insights Based on GCTC Participant Feedback
**Sectors:** Data
**Contact:** Scott Tousley
The following are observations based on the overall raw data, conversations held during the GCTC event, and Think Big Partners’ smart city industry expertise, which helps put the feedback in context with the GCTC workshop goal of developing a framework for a city data platform. Note: it is not the goal of this white paper to make absolute recommendations or declare conclusions based on the feedback data.
However, it is the goal to advance NIST’s understanding as cities develop platforms, use data for IoT smart city deployments, and try to make sense of a wide variety of data in real time in order to make purchasing, operational, and strategic decisions as effectively as possible.
We offer the following observations and insights accordingly.
- Cities are all trying to make sense of the enormous complexity of the smart city world. This complexity is both technology-based and urbanization-based. The sheer volume of people and the unforeseen consequences are stressing budgets, decision-making capacity (cycle time and decisions based on insight or prior experience), and leadership in an era of high public scrutiny fueled by the media (including social media).
- Return on investment (ROI) is top of mind for many cities. Cities are asking for proof and real-life use cases with demonstrated ROIs rather than stated (manufacturer-advertised) ROIs. Historically, there were not enough installations to demand actual ROI figures in certain sectors (e.g. LED lights), but that has changed. Where applicable, cities want to see real ROI experiences when making investment decisions. A lack of relevant ROI feedback (when applicable) creates a budgeting challenge.
- Cities must collaborate across public-private stakeholders to fully utilize data, find funding mechanisms to pay for assets, and develop a framework that is robust and considers the future.
- Certain municipal departments and assets seem to garner the most attention. This may be due to:
- Historically demonstrated technologies (such as lighting, select public safety / crime, digital kiosks, Wi-Fi, surveillance, limited data analytics, limited traffic (to include parking and limited transportation), water, energy etc.) that offer ROI insight from “early adopters” which makes “fast followers” more likely to make investment decisions for smart city technologies.
- Sectors (such as water, public safety (including advanced detection and predictive analytics), energy, airport, sewer, etc.) that have some of the most dire needs based on problems, citizen perception or infrastructure deficiencies.
- Sectors that have federal grants associated with them (water, public safety, disaster preparedness, etc.) for smart city / IoT capital investment. This allows funds to be leveraged but also demands accountability (effectiveness) that can be obtained by data after installation.
- Cities need a roadmap that allows coordinated planning and investment across department silos. Without data, it is hard to contextualize and associate the problems with each other while still trying to measure ultimate ROI.
- Cities need an effective internal framework for communication. This includes developing a basic understanding of language (including definitions) and data expectations (including the collection process, shortcomings, and accuracy of insights) while adhering to privacy policies, required citizen transparency, and Freedom of Information requests without compromising public-private business partnerships or security (including cyber) considerations.
- Cities need to assess current technology systems that collect, manage and store data to make sure they are compatible with smart city requirements and capabilities.
- Cities must understand how to process, store, encrypt (where applicable) and make data accessible at the right levels (intra-department, interdepartmental and open data policies for the public). This applies to personally identifiable information (PII) and de-identified, aggregated data.
- There is not enough focus (education, knowledge and applied understanding) on how to pay for smart city deployments. Financial engineering is a major challenge, expected to be partially addressed by public-private partnership (P3) models.
- Cities need to have a more forward looking (progressive) view of data. Being able to move from descriptive and diagnostic data levels to predictive and prescriptive data levels is essential to ROI and maximization of data across departments.
- Cities should share data with other peer cities to establish benchmarks, insights and share lessons learned relative to technology selection, operations, implementation, compliance, risk management and product development roadmap for enhanced future functionalities.
- Cities must examine their long-range planning and procurement processes in order to be more responsive and reduce risks associated with prolonged business cycles.
- Cities without public Wi-Fi are asking themselves what their role is in providing this to the public. Should Wi-Fi be an amenity that is used as a foundational layer for citizen services and quality of life? Or is this an expensive investment that should only be made as a necessary component of a city’s service delivery needs?
- Note - The issue of digital inclusion came up from various GCTC participants over the course of the two-day event.
- Cities feel the need to get universities more involved but are unsure how, or what role they should play. What role can education play in the civic technology environment?
- Cities want to have developers access the data (open data policy, developer portal, etc.) but are unsure how to protect the data and what role the city has in managing the developer.
- Cities feel the need to communicate with the public about smart city decisions, especially related to data, but are unsure how to accomplish this. Some of the concerns stem from limited knowledge, a lack of prior policy in the IoT realm, public sentiment and misperceptions about “big brother,” and the absence of a clear long-term strategy due to the dynamic nature of the IoT and smart city environment.
- During the sessions, problems were identified by use case and segment; however, specific KPIs and metrics were not widely discussed. We believe this was because many more basic questions remained open, and for many of the participants, KPIs beyond the obvious, high-level metrics were too far out of reach.
- The role of citizen-sourced data came up in various forms. Some of the inferences suggested that this data would come from personal devices (smartphones, tablets, etc.), but this was not clearly addressed. Cities should recognize the various sources of data to establish a chain of custody and verify data integrity, authenticity, quality, and permissions.
- The ability for cities to overlay data from multiple departments (example – police, where a warrant has been issued for an individual, and housing, where there is a code violation for the property owner) into one view of a common citizen (a 360-degree citizen view) would be very helpful.
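The overlay idea above can be sketched in code. This is a minimal, hypothetical illustration of merging siloed department records into a single view keyed by a shared citizen ID; the department names, fields, and ID scheme are illustrative assumptions, not a real city schema.

```python
# Hypothetical sketch: merging records from siloed departments into a
# 360-degree citizen view, keyed on a shared citizen ID. All names and
# fields here are illustrative assumptions.
from collections import defaultdict

def overlay_departments(**department_records):
    """Merge per-department record lists into one view per citizen ID."""
    view = defaultdict(dict)
    for dept, records in department_records.items():
        for record in records:
            view[record["citizen_id"]][dept] = record
    return dict(view)

police = [{"citizen_id": "C-100", "warrant_issued": True}]
housing = [{"citizen_id": "C-100", "code_violation": "overgrown lot"}]

combined = overlay_departments(police=police, housing=housing)
# combined["C-100"] now holds both the police and housing records
```

In practice, the hard part is agreeing on the shared identifier and its permissions across departments, not the merge itself.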
- Data integrity is important to maintain when data comes from a battery-powered source (sensor), where data quality could be compromised by low or under-voltage conditions, as opposed to no voltage (where no bad data is produced at all). The ability to monitor the health of the data collection process and hardware is important.
- Unique transient citizen data (example – out-of-town convention visitors, etc.) should not be overlooked. Examples would be hotel room data or transit speeds from the airport. This data set is an important part of economic development data but may not show up in the larger resident citizen data of a city.
- The ability to identify the type and trend of KPIs based on outcomes may be as important as the KPI itself. An example of this would be in the vehicle arena: the type of KPI would be “reduction” (with a decrease in comparative data being the measurement), but the data attribute may be “energy” (lower energy usage) or “emissions” (lower pollution). Participants reported that they cannot even envision all the data that will become available, but they seem to know what a positive trend would be by category.
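One way to capture this separation of KPI attribute from trend direction is sketched below. The `Kpi` structure and its field names are hypothetical, but they show how a city could evaluate outcomes by trend (“reduction” vs. “increase”) even as new data attributes appear.

```python
# Hypothetical sketch: a KPI that records both its data attribute and
# the direction of a positive trend, so outcomes can be judged by
# category. Names and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Kpi:
    attribute: str       # e.g. "energy" or "emissions"
    positive_trend: str  # "reduction" or "increase"

    def improved(self, previous, current):
        """Return True if the change matches the desired trend."""
        if self.positive_trend == "reduction":
            return current < previous
        return current > previous

energy = Kpi(attribute="energy", positive_trend="reduction")
print(energy.improved(previous=120.0, current=104.5))  # → True
```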
- Sources of funding were tied to data collection and monetization. Understanding permissible use, the value of data (and reasons for its degradation), and ways to monetize non-standard assets (example – access rights, pole rights, hanging rights, etc.) was important.
- GCTC participants asked about benchmarks and norms for funding models. Funding obviously has a direct correlation to ROI. There was concern that they did not want to strike a “bad deal” or do something beyond the boundaries (which are dynamic and evolving).
- Data sovereignty was identified as a “BIG” deal in non-US markets.
- The desire to seek and prefer open source data platforms was strong at the city level. Open source lends itself to interoperability and allows a large array of developers to continue to innovate on the functionalities of existing hardware and software while building for future needs.
- Automating the data collection process and using the proper data collection intervals for meaningful data was discussed. Being able to create routines that allowed human monitoring for exceptions was important.
- The sheer volume of data will require new communication and interpretation methods. There was a strong understanding that visualization should be highly customizable, with the ability to have user-defined multi-layers on demand, in addition to stored routines that produce standard data reporting.
- Predictive analytics was very important to GCTC participants, especially in the area of crime.
- Being able to make changes from the city platform (control inputs) was discussed as a way to eliminate the need for multiple systems at the city level. Being able to rely on the same platform to collect, understand, and make changes to the connected infrastructure would reduce the chance of human error. It would also provide a potential risk management process: changes falling outside acceptable, pre-set thresholds could not be made without verification against recommended norms based on reported data.
- Being able to manipulate the data to find “hidden correlations” is important. This data mining could find hidden costs that could be squeezed out through more complex modeling based on a series of disparate data sources from a single asset.
- Cities discussed having a single starting point of contact or “one stop shop” where citizens could go to understand the data and interact with a government official to ask more questions or gain access if needed. This could serve as the front door to multiple siloed departments.
- Being able to detect data or data pattern abnormalities was viewed as important. Cities could use this to find faulty collection processes or methods related to hardware, firmware, or software. It could also be used to discover breakthrough innovation opportunities.
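A minimal sketch of such abnormality detection is a z-score test over a stream of readings. The flow values and the 2.5-standard-deviation threshold are illustrative assumptions, not recommended city standards; a stuck or faulty sensor often shows up as exactly this kind of extreme outlier.

```python
# Hypothetical sketch: flagging abnormal data points with a simple
# z-score test. The threshold is an illustrative assumption.
import statistics

def find_abnormalities(values, threshold=2.5):
    """Return indexes of values more than `threshold` stdevs from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]

# A faulty water-flow sensor producing one extreme reading:
flow = [10.1, 9.8, 10.3, 10.0, 9.9, 55.0, 10.2, 10.1, 9.7, 10.0]
print(find_abnormalities(flow))  # → [5]
```

Production systems would use more robust methods (rolling windows, seasonality-aware models), but the same idea applies: define expected patterns and surface deviations for human review.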
- Cities should establish “goals” for data and then create the necessary supporting data and benchmarks to support goal attainment. This could be done by department, at a macro-level for desired trend (example – reduction in ____) or on project basis for experimentation and testing.
- Cities want department leaders and technicians to communicate more effectively in order to collaborate on seemingly disconnected issues that are actually connected. The example used was reducing public health impacts / mortality rates in incidents involving ambulances. Reductions in emergency notification time, transit to the scene, and transit to the hospital are all affected by various infrastructure components, and data could reveal improvement opportunities. Weather, traffic, wayfinding, signal light synchronization, and more all play a role, but this data may exist in different departments. Leaders and technicians should look for opportunities to connect the data streams to make complex decisions more simply, in shorter amounts of time, and with better accuracy.
- Cities discussed that the term “smart city” may need to be framed and re-framed periodically with its citizens. Data can play a storytelling role to help explain the problems while also showing progress towards improvement.
- It was discussed that cities need to use a rolling planning cycle (a 5-year cycle was suggested) that allows continual refresh of goals, knowledge, and technology capability assessments against the budgeting and prioritization needs of the city. It is very important to keep KPIs relevant to each deployment even if the technology is no longer a core focus. Historical initiatives still need to be measured and reported on, especially in the case of grants.
- It was discussed that weighted scores may be used. These scores may need to be rebalanced from time to time.
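A weighted score of the kind mentioned above can be sketched simply. The criteria names and weights here are illustrative assumptions; normalizing by the total weight means the weights can be rebalanced over time without changing the scoring code.

```python
# Hypothetical sketch: a weighted score across evaluation criteria.
# Criteria and weights are illustrative assumptions; dividing by the
# total weight lets weights be rebalanced without renormalizing by hand.
def weighted_score(scores, weights):
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_weight

weights = {"roi": 0.5, "citizen_impact": 0.3, "risk": 0.2}
scores = {"roi": 80, "citizen_impact": 90, "risk": 60}
print(round(weighted_score(scores, weights), 1))  # → 79.0
```

Rebalancing then amounts to updating the `weights` mapping, for example shifting emphasis from ROI toward citizen impact as a deployment matures.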
- Implementation and education were additional challenges that need to be addressed. For a smart city deployment to be successful, line workers and staff members need education, and implementation of both processes and data usage needs to be addressed.
- Cities discussed leveraging large IT companies for expertise as part of diligence and design-build process. Cross department consulting services could be done by external companies and internally by city stakeholders.
- Cyber-security was a recurring underlying theme for all cities as it pertains to data, control of devices and the actual operations of various assets. Cities may look to state and federal agencies for help with process improvements and risk management.
- Cities discussed the need to collaborate with other cities to stay ahead of threats, especially for targets that are extremely high-risk or vulnerable (transportation, water supply, public health, etc.).
- Cities need to be able to connect smart city investment with the entire citizen population (inclusiveness) and also with economic development. Specialized data interpretation may be needed to make indirect correlations.