Contact: Wilfred Pinfold
BigClouT
The BigClouT project aims to give cities an analytic capability by exploiting available big data from sources such as IoT devices, open data, social networks, and mobile applications, and to use these data to improve daily life for cities, their citizens, and their visitors. The target applications are:
- Measuring the economic impact of large events organized in the city on the local economy, and providing customized recommendations to visitors (shopping, restaurants, sightseeing, etc.)
- Improving the mobility of the citizens and visitors during important events such as big congresses, festivals, Olympic Games, etc.
- Deployments and replications in 4 pilot cities in Europe and in Japan
Flood Abatement
Predicting flooding and protecting human life; current project in Dunedin, NZ
- A major flood event each year causes large amounts of residential and commercial damage
- A contributing factor to the flooding was blocked storm drains
- The city responded by engaging a contract company to inspect and clean storm drains
- The city has over 8,000 storm drains within the city limits
Integrated Analytics and Scheduling of Emergency Responders Under Uncertainty
This project will use historical data to create models of various safety and emergency incidents across metropolitan Nashville, establish their causes, and use this information to identify appropriate equipment requirements in different situations. This analysis will be combined with historical traffic and delay information to ensure that emergency vehicles are distributed at optimal locations and proactively maintained. During an incident, a real-time decision support system will guide vehicle dispatch.
Microclimate Prediction for Willamette Valley Vineyards
Leveraging regional weather data and weather stations at individual vineyards to develop both a regional prediction of when bud break and bloom will occur and highly specific predictions of those same dates for individual vineyards. Additional opportunities include predicting and issuing alerts for freezes, powdery mildew, and other events targeted at specific vineyards.
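As a concrete illustration of this kind of vineyard-level prediction: one common phenology baseline (not stated in the project description, so an assumption here) accumulates growing degree days (GDD) from daily temperatures until a site-calibrated threshold is crossed. All constants below are illustrative, not calibrated values:

```python
# Hedged sketch of a growing-degree-day (GDD) bud-break predictor.
# The base temperature and threshold are illustrative assumptions;
# real values would be calibrated per vineyard from historical data.

BASE_TEMP_C = 10.0     # assumed base temperature for grapevines (°C)
BUD_BREAK_GDD = 100.0  # assumed accumulated-GDD threshold for bud break

def daily_gdd(t_min, t_max, base=BASE_TEMP_C):
    """GDD contribution of one day from min/max temperature (°C)."""
    return max(0.0, (t_min + t_max) / 2.0 - base)

def predict_bud_break_day(daily_temps, threshold=BUD_BREAK_GDD):
    """Return the 1-indexed day on which accumulated GDD crosses the
    threshold, or None if it is never reached in the observed window."""
    total = 0.0
    for day, (t_min, t_max) in enumerate(daily_temps, start=1):
        total += daily_gdd(t_min, t_max)
        if total >= threshold:
            return day
    return None

# Example: a steadily warming spring at a hypothetical vineyard station.
temps = [(5 + d * 0.3, 15 + d * 0.3) for d in range(60)]
bud_break = predict_bud_break_day(temps)
```

The same accumulation-and-threshold pattern extends naturally to freeze and powdery-mildew alerts, with different triggering conditions per event type.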
StormSense
|
Objectives
- Apply modeling to address multiple flood types and determine the probable areas at risk, utilizing fixed sensors and crowd-sourced data collection verified by post-flood analysis.
- Use new state-of-the-art, high-resolution hydrodynamic models driven by atmospheric weather predictions to forecast flooding from storm surge, rain, and tides at the street-level scale, improving disaster preparedness.
Supercomputer Modeling and Artificial Intelligence Cluster for Smart Cities and Regions
A national and international innovation ecosystem that addresses challenges by using high-performance supercomputers to apply existing, and create novel, Artificial Intelligence for developing Smart City and Region services, including resilience solutions (Net Zero Energy Buildings and Districts, Carbon Neutrality, Advanced Energy Penetration, Energy Efficiency, Advanced Mobility Solutions, Environmental Health, Public Safety, Disaster Management, V2X Electrification, Climate and Water).
This Cluster will deploy both proven and newly devised large-scale Smart City and Region supercomputing predictive analytics to model regional pathways to resiliency, identifying scientific and marketplace solutions to provision and deploy the right Smart City and Region solution stacks.
AI models will include the next generation of Artificial Intelligence, including Complex Systems Simulations, Emulation, and Optimization in order to create highly efficient and robust “autonomous systems” and portfolio-scale analytics and automated services. The predictive analytics developed by Innovation Corridor’s national lab partners will be part of the initial portfolio and baseline.
The Action Cluster will develop a repository of Artificial Intelligence technologies and architectures from its founders (Innovation Corridor and Powering IoT) together with ecosystem partners, in order to fast-track the next generation of automated, resilient, cyber-secure, environmentally friendly, and intelligence-driven distributed systems for Smart Cities and Smart Regions. In addition, new forms of Artificial Intelligence technologies and systems will be developed within this Action Cluster, such as a "Personal Digital Twin": a personal digital avatar that purposefully and mindfully serves each citizen within a society.
Large-scale, compute-intensive data sets, sourced from a range of partners, jurisdictions, and localities, will train Artificial Intelligence solutions prior to deployment and will be used to refine the Cluster's predictive analytics. An ecosystem of cities, regions, federal research labs, private-sector technology companies, and universities provides a unique framework and sandbox to support the deployment of Smart City enabling technologies and the provision of resiliency-based Artificial Intelligence solutions for Smart Cities and Smart Regions.
User-centered Heterogeneous Data Fusion for Multi-networked City Mobility UHDNetCity
Both sentient and sensing, the smart city is built on identifying millions of mobility-related occurrences: evaluating residents' consumption of water and energy, recording traffic on the road network, detecting communication-network congestion at hot spots, measuring air quality, and tracking public opinion on social networks. This project aims to characterize urban mobility in smart cities, which are interconnected and interdependent sociotechnical systems. Our objectives are:
- Create a data-fusion tool for integrating heterogeneous data from various urban infrastructure and social media.
- Enhance citizen participation in urban mobility characterization using a smartphone app such as DigiTally in Tallahassee.
- Define new indices to measure mobility as a multidimensional spatiotemporal entity in the city ecosystem.
Predictive modeling is the process of using statistical and machine learning algorithms to analyze historical data and make predictions about future events.
This can be used in a variety of fields, such as finance, marketing, and healthcare, to predict future outcomes based on past patterns and trends.
Predictive modeling typically involves several steps:
- Data collection: Collecting and preparing historical data that will be used for the model.
- Feature selection: Identifying the important variables or "features" that will be used to make predictions.
- Model selection: Choosing an appropriate algorithm or model to analyze the data and make predictions.
- Training: Using the historical data to "train" the model, so it can learn the patterns and relationships in the data.
- Validation: Testing the model on a separate dataset to ensure its predictions are accurate.
- Deployment: Using the model to make predictions on new data.
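The steps above can be sketched in a few lines of pure Python with a toy least-squares linear model. A real project would typically use a library such as scikit-learn; the data, split, and single feature here are illustrative:

```python
# Minimal sketch of the predictive-modeling pipeline with a toy linear model.
# All data values are made up for illustration.

def fit_linear(xs, ys):
    """Training: fit y ≈ a*x + b by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    return a, mean_y - a * mean_x

def predict(model, xs):
    """Deployment: apply the trained model to new data."""
    a, b = model
    return [a * x + b for x in xs]

def mean_abs_error(ys_true, ys_pred):
    """Validation: average absolute error on a held-out set."""
    return sum(abs(t - p) for t, p in zip(ys_true, ys_pred)) / len(ys_true)

# Data collection / feature selection: historical (feature, outcome) pairs.
history_x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
history_y = [2.1, 4.0, 6.2, 7.9, 10.1, 12.0]

# Train on the first four points; validate on the held-out last two.
model = fit_linear(history_x[:4], history_y[:4])
error = mean_abs_error(history_y[4:], predict(model, history_x[4:]))
```

The holdout split is the key step: a model is only trusted for deployment if its error on data it never saw during training is acceptably low.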
Predictive modeling can be used to predict a wide range of outcomes, such as customer churn, fraud detection, stock prices, crop yields, and more. The goal is to use historical data to identify patterns and make accurate predictions about future events, allowing organizations to make data-driven decisions and improve their operations.
However, the accuracy of the predictions will depend on the quality and completeness of the data, the chosen algorithms and parameters, and the complexity of the problem. It is also important to note that predictive modeling is not a crystal ball and cannot predict future events with 100% accuracy.