Ever since organisations started taking advantage of insights derived from Big Data, data scientists have concentrated their efforts on making accurate predictions about the future. A few years later, with the help of automation, developments in machine learning (ML) and advances in the application of artificial intelligence (AI), the ability to make predictions is no longer wishful thinking. We now live in an era of predictive analytics.
Predictive analytics incorporates both ML and AI to make predictions about unknown future events by analysing rich historical data. Although the technology applies to almost any industry, in transportation - where travel patterns remain reasonably regular - its uses can be particularly enlightening. It can empower cities to become more efficient by optimising network performance, it can help transit agencies provide a better customer experience by preventing service downtime, and it can ensure smooth throughput of main city arteries, predicting potential areas of congestion long before the first car comes to a stop in traffic.
But how does the technology actually work? How are the predictions made and what conditions must be met to ensure maximum accuracy and reliability? And finally – are our cities ready to reap the benefits of predictive analytics?
Although predictive analytics can apply to various areas of the transportation industry, we’ll focus on two key categories: predictive analytics for services (predictive maintenance) and predictive analytics for operations. The former helps transit agencies pre-empt equipment failures and improve customer satisfaction, while the latter helps cities deliver more robust solutions for traffic management with confidence and at a reduced cost.
In the predictive maintenance model, the data for modelling and forecasting typically comes from historical equipment records and external variables, such as passenger throughput and meteorological conditions. Equipment data can cover the history of all assets, down to the component level of each machine (let’s say a ticket vending machine), including elements such as the smart card reader, coin vault, bill handling unit, and more.
This information, alongside other historical data such as repair call logs, feeds into the ML part of the predictive analytics engine. The engine assimilates the data, creates a historical understanding of each asset’s behaviour and performance over time, looks for patterns and correlations, and then, with the help of a dedicated algorithm, builds a predictive base model of how the asset is likely to behave in the future. For the base model to be deemed accurate, it must make correct predictions at least eight out of 10 times. Importantly, the predictive tool creates projections of the residual life of an asset – not just its next issue – allowing transit agencies to take proactive action ahead of equipment failure. This information feeds into the life cycle management for the devices and enables agencies to improve overall asset maintenance.
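To make the idea concrete, here is a deliberately minimal sketch of a residual-life prediction of the kind described above. The article doesn’t disclose Cubic’s actual algorithm, so this uses a simple mean-time-between-failures estimate; the function names, the maintenance horizon and the example failure history are all illustrative assumptions.

```python
from statistics import mean

def mtbf(failure_times):
    """Mean time between failures for one asset (times in days)."""
    gaps = [b - a for a, b in zip(failure_times, failure_times[1:])]
    return mean(gaps)

def residual_life(failure_times, now):
    """Estimated days of life left: MTBF minus time since the last failure."""
    return mtbf(failure_times) - (now - failure_times[-1])

def predict_failure(failure_times, now, horizon=30):
    """Flag the asset if it is expected to fail within the horizon."""
    return residual_life(failure_times, now) <= horizon

def accuracy(predictions, outcomes):
    """Fraction of correct predictions; the base model needs >= 0.8."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(predictions)

# A ticket vending machine component failed on these (hypothetical) days:
history = [0, 90, 185, 270]                 # roughly every 90 days
print(residual_life(history, now=350))      # 10.0 days of estimated life left
print(predict_failure(history, now=350))    # True: flag for proactive maintenance
```

A production engine would of course learn from many correlated signals (throughput, weather, repair logs) rather than a single failure clock, but the output contract is the same: an estimate of remaining life, and a flag raised before the asset actually fails.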
It takes more than simply flicking the ‘ON’ button for predictive analytics to work effectively. For instance, the reliability of the predictive model often depends on the quantity and quality of data. Unfortunately, not all agencies can offer a robust data source. Even when an agency does monitor and collect information about its assets, data can become corrupted or lost, information might not be collected consistently, or might be insufficient for predictive analytics. Equally, not all equipment can collect, store and transmit large amounts of real-time data into the central system.
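Guarding against the data problems mentioned above usually starts with a validation pass before any record reaches the model. The sketch below is an assumed, simplified schema (asset id, timestamp, numeric reading); the field names and sample records are hypothetical.

```python
def clean_asset_records(records):
    """Separate usable telemetry from corrupt or incomplete records.

    Records missing a field, missing a timestamp, or carrying a
    non-numeric reading are rejected rather than guessed at.
    """
    required = {"asset_id", "timestamp", "reading"}
    clean, rejected = [], []
    for rec in records:
        if (required <= rec.keys()
                and rec["timestamp"] is not None
                and isinstance(rec["reading"], (int, float))):
            clean.append(rec)
        else:
            rejected.append(rec)
    return clean, rejected

raw = [
    {"asset_id": "TVM-01", "timestamp": 1710000000, "reading": 3.2},
    {"asset_id": "TVM-02", "timestamp": None, "reading": 1.1},          # lost timestamp
    {"asset_id": "TVM-03", "timestamp": 1710000060, "reading": "ERR"},  # corrupted value
]
clean, rejected = clean_asset_records(raw)
print(len(clean), len(rejected))  # 1 2
```

The rejection rate itself is a useful diagnostic: if a large share of an agency’s records fails even checks this basic, the data source is probably not yet robust enough for predictive analytics.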
To further ensure the accuracy of the predictive engine, a transit agency can rely on a combination of predictive technology and human expertise. This approach is particularly important in the early stages of predictive maintenance. When the model flags a particular piece of equipment as likely to fail, a subject matter expert deployed to fix the issue can confirm the prediction by, for instance, reporting significant wear and tear, indicating failure was imminent. Such validation is not only helpful for optimising the predictive algorithm, but it also provides a critical feedback loop for continually improving the quality of predictions.
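One simple way to operationalise that feedback loop is to track how often technicians confirm the model’s flags and to adjust the alerting threshold accordingly. This is a sketch under assumed mechanics (the article does not describe Cubic’s implementation); the target precision and step size are illustrative.

```python
def precision(flags, confirmations):
    """Share of maintenance flags that a technician confirmed on site."""
    confirmed = sum(c for f, c in zip(flags, confirmations) if f)
    flagged = sum(flags)
    return confirmed / flagged if flagged else 0.0

def adjust_threshold(threshold, prec, target=0.8, step=0.05):
    """Raise the alert threshold when too many flags turn out to be
    false alarms; relax it when technicians confirm nearly every flag."""
    if prec < target:
        return min(threshold + step, 0.99)
    return max(threshold - step, 0.5)

# Three assets flagged, two confirmed as genuinely worn on site:
flags = [True, True, True, False]
confirmations = [True, False, True, False]
prec = precision(flags, confirmations)      # ~0.67, below the 0.8 target
new_threshold = adjust_threshold(0.70, prec)  # tighten alerting to ~0.75
```

The key point is that the expert’s field report is not just a work order closure: it is labelled training signal that flows back into the model.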
Earlier this year, the government of New South Wales, Australia’s most populous state, pledged to spend millions of dollars on enhancing the monitoring and management of the road network across the region. As part of the initiative, it tasked Cubic with delivering an intelligent congestion management programme – an example of predictive analytics for operations. This data-driven transport management platform enables cities to predict traffic patterns, reduce congestion, improve planning for major events and sharpen the response to incidents on the transport network. When it is ready, by the end of 2020, it will make Sydney the first city in the world to manage its transportation network based on a predictive analytics model.
However, projecting traffic patterns and incidents for an entire transportation network in a city as large as Sydney is a tough ask that involves several moving parts. The prediction engine must assemble and synthesise data from multiple input points throughout the city. These include pedestrians, private cars, public transit vehicles, third-party transportation services (such as ride-hailing services, scooters and micro-transit), as well as an entire host of city infrastructure endpoints – traffic cameras, traffic and street lights, parking, bus stops, railway stations – the list can go on. On top of that, the system must be smart enough to incorporate variable data that impacts the network, such as weather and seasonality (a Black Friday shopping event, a local football game, a holiday).
Assembling all the information is not always possible. Although the notion of ‘smart cities’ has captivated urban planners and city agencies alike, not many metropolises can yet claim the title. With significant gaps in smart city infrastructure, the lack of a common language for information sharing between various city systems, and inadequate network coverage, the pace of innovation in urban areas often falls short of an environment ideal for making predictions. In such situations, the predictive engine needs to not only analyse, understand and mimic the actual transportation system but, inevitably, fill in information gaps with AI-based simulations. As cities upgrade their smart infrastructure and agencies invest in connectivity, they can fill in the gaps in information, leading to better accuracy of predictions.
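The simplest form of gap-filling is spatial: when a detector drops out, estimate its reading from its neighbours along the corridor. The sketch below is a stand-in for the far richer AI-based simulation the article describes; the nearest-neighbour averaging rule is an assumption for illustration.

```python
def fill_gaps(counts):
    """Impute missing detector counts (None) from the nearest
    reporting detectors on either side along a corridor."""
    filled = list(counts)
    for i, value in enumerate(filled):
        if value is None:
            left = next((filled[j] for j in range(i - 1, -1, -1)
                         if filled[j] is not None), None)
            right = next((filled[j] for j in range(i + 1, len(filled))
                          if filled[j] is not None), None)
            neighbours = [v for v in (left, right) if v is not None]
            filled[i] = sum(neighbours) / len(neighbours)
    return filled

# Vehicle counts along five detectors; two have dropped out:
print(fill_gaps([120, None, 140]))  # [120, 130.0, 140]
```

Whether the gap is filled by interpolation or by a full simulation of the network, the principle is the same: the engine must produce a complete picture of the system even when the city’s infrastructure cannot.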
Thankfully, despite all the complexity, predictive models are easily scalable. Although the initial investment of time and resources to build the base model is significant, once the legwork has been done, models can then be applied to other cities, with the possibility of improving their performance over time. If, for instance, the initial accuracy of a base model in a new city is 70%, the predictive algorithm can be adjusted with new data sources to yield better results.
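One lightweight form of that adjustment is recalibration: keep the transplanted base model’s scores, but re-tune its decision threshold against the new city’s labelled data. The article does not specify how Cubic adapts its models, so treat this as an assumed, minimal example; the scores and labels are invented.

```python
def best_threshold(scores, outcomes, candidates):
    """Pick the decision threshold that maximises accuracy of an
    existing base model on a new city's labelled data."""
    def acc(t):
        preds = [s >= t for s in scores]
        return sum(p == o for p, o in zip(preds, outcomes)) / len(outcomes)
    return max(candidates, key=acc)

# Base-model scores on events in the new city, with observed outcomes:
scores = [0.9, 0.8, 0.65, 0.4, 0.3, 0.55]
outcomes = [True, True, True, False, False, False]
print(best_threshold(scores, outcomes, [0.5, 0.6, 0.7]))  # 0.6
```

Full retraining on local data can follow, but recalibration alone often captures much of the gain, which is why a base model built for one city can be redeployed elsewhere without repeating all of the original legwork.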
Confirming the accuracy of predictions in the operations model is less straightforward than in the predictive maintenance model, since predictions are made on the basis of complex and dynamic algorithms that constantly adjust and re-evaluate information. It is, nevertheless, possible through an analysis of AI simulations of various traffic events and benchmarking against similar historical scenarios. It’s important to keep in mind that responding to a traffic event in one part of the network - e.g. redirecting drivers to alternative routes due to a broken-down vehicle - may inadvertently affect other parts. Therefore, prioritising the overall efficiency of the network over singular gains around individual traffic events must always be front of mind when assessing the effectiveness of the predictive analytics model.
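That network-wide framing can be captured in a few lines: score an intervention by the total delay across the whole network, not by the improvement at the incident site. The corridor names and delay figures below are hypothetical.

```python
def network_delay(delays):
    """Total delay (minutes) across all corridors - the quantity to
    minimise, rather than the delay at any single junction."""
    return sum(delays.values())

before = {"A": 12, "B": 8, "C": 5}
# Rerouting around a breakdown on corridor A helps A but loads B and C:
after = {"A": 4, "B": 11, "C": 9}

# Corridor A improved dramatically, yet the action only counts as a
# success if the network-wide total went down as well:
print(network_delay(before), network_delay(after))  # 25 24
```

Here the reroute is a (narrow) win: the network total falls from 25 to 24 minutes even though two corridors got worse. Had it risen, the locally attractive intervention would have failed the test the article describes.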
Although Cubic’s tagline for Sydney’s transport management platform is ‘Predict 30 minutes into the future, act in 5’, longer-term predictions are only a matter of time. With access to the right information, a good understanding of the network, and the imminent arrival of 5G, cities won’t have to wait long to predict an hour, 12 hours or a day into the future.
With time, we can reasonably expect cities’ predictive abilities to grow exponentially, paving the way for automated city networks where traffic lights instantly recognise and give priority to emergency vehicles, dynamic lanes accommodate changing traffic conditions, drivers’ phones alert them to road obstructions and automatically redirect them to an alternative route, and autonomous cars never get stuck in traffic.
While the future is undoubtedly impressive, it’s important to keep in mind that predictive analytics cannot operate in a vacuum. Without the appropriate regulatory environment, the coming together of different stakeholders, and the overall investments in city infrastructure, even the most advanced analytics technology will fail to make a difference. By understanding the technicalities and practical problems faced by cities that have invested in the technology, transit agencies, city authorities and tech companies alike can ensure cities are adequately prepared to make the most of predictive analytics, today and well into the future.