tl;dr

The speaker, Jaymie Croucher, discusses the Surface Intelligent Transport Systems (SITS) program at Transport for London (TfL). TfL manages various modes of travel and a significant road network in London. They use geospatial data to tackle challenges in managing traffic demand, road network changes, and network properties.

The SITS program aims to unify and improve traffic systems, with geospatial data forming the foundation for a digital twin of operations. The Common Operational Road Network (CORN) reduces raw data, enabling the analysis of vehicle movements, scheme performance, and 3D visualizations.

Future steps involve incorporating micromobility, enhancing safety for pedestrians and cyclists, and utilizing telematics data. The speaker shares advice for embarking on a similar geospatial digital twin journey, emphasizing realistic accuracy expectations, consensus on goals, interoperability, and self-assessment of data readiness.


Transcript

My name is Jaymie Croucher. I am the GIS lead within our TfL operations department, within network management resilience, which is a mouthful all unto itself. I’m going to spend the next 15-20 minutes going through our SITS programme and how geospatial has been put at the very centre of it to help deliver our operations digital twin. I want to start, as all good things should, by recognising my colleagues and the fact that I'm extremely lucky to be here to present their work, and also to outline from the off, to head off any unnecessarily awkward questions at the end, that I'm no expert in digital twins at all; I'm merely outlining how we've used geospatial to answer some of the questions related to delivering them. If you're interested in that, please do reach out on the back of this, I’d love to have that conversation. Excellent.

So who's TfL? So TfL's normally synonymous with London Underground and while that's a spectacle in its own right, we also have lots of different modes of travel, as you can see. Please use them on the way home. But we also manage around 580 kilometres of our road network and that accounts for around 5% of the roads within London, but it carries around 30% of the traffic volume. So that just goes to show you the size and scale of the road network that we monitor.

But not only that, we also work with our borough partners and National Highways to help understand and inform their roads as well, to help improve them. And we do this because we've got an enormous number of sensors out on our network, and we also control the 6,400 traffic signals within the capital. What that does is allow us to implement control measures so that we can manage flow across London's roads. And within my directorate of Network Management and Resilience, we manage the road network 24/7, 365 days a year.

We have an enormous portfolio of surface assets, including 21,500 bus stops and 25,000 trees. We help coordinate over 50,000 road works annually, working with suppliers to mitigate the impact they have on the road network. And lastly, what we are most focused on is driving the improvement and increase of the 3.7 billion journeys made every single year by bike, on foot and by bus. All of this generates an enormous amount of data, as you can probably appreciate, and I think TfL is generally perceived as being very data rich. To a large extent we're very good at structuring our data, and that's because we've had an open data offering since 2007, which has allowed us to listen to customers and developers about what they need and structure our data accordingly.

But although we're very good at structuring our data, I don't think we're very good at standardising it, especially geospatially. What I mean by that is how one team perceives bus speeds on a particular road, versus how another team might count the number of cyclists, can drastically change depending on the lens they view the real world through, in a geospatial context. So it's really important to mitigate these problems, to ensure that we're viewing and reporting on the world in a consistent manner, because we also communicate this data to the public.

But the one benefit we've got from all of this is that geography and time link all of this data, so we've got the ability to represent it visually. And that brings me nicely to the challenge, which is pertinent with the BBC being around the corner. If you look at the headlines, or type into Google now "what's the most congested city in the world?", London will feature somewhere in the top of the lists, at least for the last two years. I won't stand here and be proud of it, and I won't get into the politics of what constitutes congestion or how you measure it, but it suggests, at least on the surface, that TfL isn't particularly good at doing its job, which I'll leave up for debate. Headlines are normally quite polarising, though, and this is absolutely no different, because as you saw from the last slide, TfL isn't responsible for managing only one mode; we're responsible for managing competing demand.

And since 2018 we've been helping deliver the Mayor's Transport Strategy using the Healthy Streets framework, which you can see on screen. This breaks down the human experience on the street into ten factors, with an ultimate aim of improving London to make it a cleaner and greener city. So if you imagine a typical London street, TfL's job isn't just to manage one demand, it's to manage the competing demands and keep the equilibrium across all of them whilst helping to grow more sustainable travel. And therein lies the real challenge.

But not only that, we also have the challenge of understanding our road network. One part is the data-led challenge of poor-quality data: geospatial data has transformed itself over the last 15-20 years, so if we want to look historically at how our roads have changed, that's really difficult, because the geospatial data itself has changed fundamentally. We also see change in our network attributes: geospatially, the road network changes by around 5-6% a year within the London area, and that makes it really difficult to catalogue and understand that change over time as well.

And then lastly, we've got network properties, and some of these are unique to London, in the sense that it's got one of the most advanced and complicated road networks in the world, but the stochastic properties of driver behaviour make it really difficult to understand and interpret demand and act accordingly. And if we just take the data-led challenge: in my four years at TfL I've seen this, and I'm sure there are people in the audience that have seen it as well. There are lots of different ways of viewing the world, whether that be via OpenStreetMap data, the National Street Gazetteer, Ordnance Survey or INRIX data, for example. And depending on how you view the world, the attribution, the length and the definition can change substantially, or sometimes in very minor ways. But it's really important that we standardise the way we view the world, because of how we report on it, as I've already mentioned. And that brings me nicely around to SITS.

So traditionally, since TfL's formation in 2001, we've tackled these problems in isolation, and we've built up traffic systems accordingly that have then grown organically within the organisation, and people feel fundamentally that it's part of their role to use that particular piece of software. That won't be unique; I'm sure there are many people in this audience that have that same problem of users not wanting to move away from a certain version, or whatever it might be. But in 2016, TfL undertook a digital transformation journey known as the Surface Intelligent Transport Systems programme. What this does is look to unify all of these disparate traffic systems, overhaul and improve them, to transform the way that we understand the road network, with an ultimate aim of saving around a billion pounds in customer benefit over its lifetime.

And you can see it's broadly split between two areas. We've got our applications here at the top, above the integration service, which involve our incident management, our predictive capability to understand and mitigate changes, and then our real-time optimiser. But underneath all of that, as a lot of you are probably quite excited by, is the data tier, and how all of that gets fed in. And that's really important, because what comes out of these systems is only going to be as good as what gets fed in. So what is a digital twin? It was fantastic to see the last presentation cover what that definition is.

I'm going to pit two 1Spatial colleagues against each other with Seb Lessware's definition from last year's Smarter Data, Smarter World, because I think it's really important to understand the difference between the two, particularly within my domain of GIS, and that of a lot of people in the room. A digital twin has traditionally been dressed up as a 3D visualisation. That's not necessarily how we view or perceive it within the context of SITS. We view it in essence as a mirror of the real world: one that we can not only interact and work with, but also use to enact change in real time. That's the real benefit of a digital twin. So we've got a digital transformation journey, and that's the hard bit, getting the funding. We've got all of this fantastic data behind us. A digital twin would seem the natural way to go then, right? So how can we apply it?


So if we look at some of the business challenges, we can see some that are quite unique to transport; others, such as disparate data sets and real-time data, are actually more common across lots of different sectors. But we can see that the response to all of those could potentially be delivered by a digital twin. But where does geospatial, where does GIS, come into this?

We can see that GIS has the ability to underpin all of this data. As I mentioned earlier, we can link all of this data by space and time, but if we can align it to a common framework, we can be really clear-cut in the insights we're delivering and really confident in our data. I'm sure there are people sat in the seats now thinking that, like I say, we've not always been particularly confident quoting a particular figure or number, but it's the best you've got available to you. This is really about starting from a foundation and growing from that.

So where does that manifest itself within SITS? Well, going back to the initial discussion at the start of today, we've got all of our data: our video analytics, sensors, traffic systems, whatever it might be. And it's taking the data-first approach to make sure that all of those are not only captured correctly, but also aligned to a common framework, or a common geography, as we call it. What that allows us to do, further downstream as the data goes through the applications of SITS and then subsequently into the business functions, is be really confident in what we're doing and delivering.

Once we had that scoped out, we had the ability to build a number of principles and standards around it. The first was to ensure that the whole business was along for the ride. Now, TfL is an organisation of around 30,000 people, which is quite a substantial size, and with that number of people holding different views of the road network, the challenge is ensuring that all of their requirements for what they need to use a road network for are captured.

The second is around the ability to represent traversable directions of traffic. We have to understand, not only for vehicles but also for cyclists and pedestrians, the different ways that they interact with and use the network. We also needed to ensure that the right geographic footprint was used. Whenever people look at a map of London, typically it's to the Greater London Authority boundary, but an example of where we need data beyond that is our bus routes: they extend to Dartford, for example, around three miles outside the Greater London Authority boundary. So we needed to ensure that was covered.

We also needed to ensure that we had the right level of aggregation, and what I mean by this is that it needs to be both computationally and cognitively efficient. I'm sure lots of you will have had a map open with a million data points in it, where it's really hard to perceive what that map is trying to communicate to you. We need to ensure that as we aggregate our data, it's at the right level, one that not only an end user can infer from, but the system can too. That's what makes a digital twin really powerful.

We also needed a methodology in place for change management in terms of data turnover. As I mentioned, with that 5-6% turnover year on year, we need a process in place for not only deprecating but also promoting links, with a historic look-up afterwards as well. And then finally, a network that supports linear referencing, and that's for things like bus stops, which change at such a frequency that it wasn't conducive to include them within this particular frame.
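To illustrate the deprecate/promote idea with a minimal sketch: each link can carry the network iteration it was promoted in and, if retired, the iteration it was deprecated in, so any past state of the network remains queryable. The field names, link ids and iteration numbers here are invented for illustration; the talk does not describe TfL's actual schema.

```python
# Invented example of versioned links with a historic look-up.
links = {
    "L100": {"promoted": 3, "deprecated": None},  # still live
    "L101": {"promoted": 3, "deprecated": 9},     # retired at iteration 9
    "L102": {"promoted": 9, "deprecated": None},  # its replacement
}

# Historic look-up: which link(s) superseded a deprecated link.
successors = {"L101": ["L102"]}

def live_in(iteration):
    """Return the set of link ids that were current in a given iteration."""
    return {lid for lid, v in links.items()
            if v["promoted"] <= iteration
            and (v["deprecated"] is None or iteration < v["deprecated"])}
```

A historic query then becomes a simple filter: `live_in(5)` reconstructs the network as it stood at iteration 5, and `successors` lets data referenced against a retired link be carried forward.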

So once we’d outlined the principles we wanted to cover, that led us to a three-phase process, the first of which was to align our data to the lowest granularity. So what is the lowest granularity we want to look at our data through? We opted for what was at the time the recently released Ordnance Survey MasterMap Highways road link product, because a lot of our data was already aligned to the ITN. Not only that, it allowed us to glean the benefit of the PSGA and the look-ups that already exist to things like the National Street Gazetteer. We then needed to aggregate that data, as I mentioned, to be both computationally and cognitively efficient. There's a very long document outlining this, put together by the consultant who worked on it, which I haven't brought along today, but if you're interested I can outline the rules specific to what we've used them for.
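The talk doesn't reproduce the aggregation rules themselves, but one common way to aggregate a link-level road network is to merge chains of links that meet at simple two-way nodes, keeping only junctions and dead ends. This is a hypothetical sketch of that idea, not TfL's actual method; all link and node ids are invented.

```python
from collections import defaultdict

def aggregate(links):
    """Merge chains of links through degree-2 nodes.
    links: list of (link_id, node_a, node_b, length_m).
    Returns list of (constituent_link_ids, end_node, end_node, total_length)."""
    deg, adj = defaultdict(int), defaultdict(list)
    for link in links:
        _, a, b, _ = link
        deg[a] += 1; deg[b] += 1
        adj[a].append(link); adj[b].append(link)
    # Keep only junctions (degree != 2) as nodes of the aggregated network.
    keep = {n for n, d in deg.items() if d != 2}
    merged, used = [], set()
    for start in keep:
        for first in adj[start]:
            if first[0] in used:
                continue
            used.add(first[0])
            ids, total = [first[0]], first[3]
            node = first[2] if first[1] == start else first[1]
            # Walk along the chain, absorbing links until we hit a junction.
            while node not in keep:
                nxt = next(l for l in adj[node] if l[0] not in used)
                used.add(nxt[0])
                ids.append(nxt[0])
                total += nxt[3]
                node = nxt[2] if nxt[1] == node else nxt[1]
            merged.append((tuple(ids), start, node, total))
    return merged
```

Four raw links along one street with no side roads collapse to a single aggregated link whose length is the sum of its parts, which is the kind of reduction that gets a 600,000-link product down to a far smaller operational network.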

And then lastly, it was to embed our operational data and linear referencing onto the model, so that we've got the ability to look at all of our assets around this common framework and make like-for-like comparisons. The result is the Common Operational Road Network, called the CORN. And before anyone asks, I had absolutely no say in naming that particular product. But this is the end output: it takes our raw, lowest-granularity OS MasterMap Highways product of just under 600,000 links and half a million nodes, with no operational data, traversable directionality or node split, and it reduces it by around 89% for links and 93% for nodes. And before anyone calls me out and says, "Well, you've just deleted a load of links and spent the last two or three years doing nothing": as I mentioned, TfL is an entity that's only really interested in a specific category of roads, whether that be our bus routes or the strategic road network, for example. But we've also got our operational attributes aligned to it, and we'll go into those in a little more detail on the next slide.

But we've also split it down by traversable mode and mode split as well, so we know what modes are able to use a particular road link. And for the end result, some of our attributes come from our operational data, but some, as I mentioned, like speed limits, we get from the Ordnance Survey; we get things like road widths, for example, as well. But the real power of the common framework, or the common geography, is that we insist on embedding it within commercial processes as well. When we're asking our suppliers to align data to this framework, we're insisting that it's done to the lowest granularity. So, for example, we get data from INRIX which is aligned to the OS Highways product, and that allows us downstream to infer what's going to be coming from that particular API. The reason that's really important is that, as I'm sure many of you have found, on-the-fly comparisons are computationally very slow, and when working with data at scale that's one of the chief challenges.
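Linear referencing, mentioned as the way fast-changing assets like bus stops are held, means storing an asset as a link id plus an offset along that link rather than baking it into the network geometry. A minimal sketch of resolving such a reference to a coordinate, assuming straight-line links for simplicity (real links are polylines); all ids and coordinates are invented.

```python
def locate(asset, links):
    """Resolve a linearly referenced asset to a point.
    asset: (asset_id, link_id, offset_m)
    links: {link_id: ((x1, y1), (x2, y2), length_m)}"""
    _, lid, offset = asset
    (x1, y1), (x2, y2), length = links[lid]
    # Clamp the offset so a stale reference can't fall off the link.
    t = max(0.0, min(1.0, offset / length))
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```

Because the asset only stores `(link_id, offset)`, a bus stop can move or be re-surveyed without the network itself needing a new iteration; only the reference changes.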

So how are we using it? CORN is now in its 15th iteration, and we're now using it to actively answer some of the queries the business has. The first is around air quality, and this is obviously very contentious with the recently enlarged ULEZ. We use our digital twin, via our Common Operational Road Network, to look at second-by-second vehicle movements at junctions, so we can anticipate how performant they are in terms of air quality by measuring the emissions at the tailpipe. So it's really smart in what it's doing straight from the off.

We also use it as a situational awareness product. What this allows us to do is create time cuts for reporting on key performance metrics such as congestion, but it also allows us to measure how successful schemes are, both pre- and post-delivery. That allows us to understand that we are doing the right thing out on the street, with things like low traffic neighbourhoods, for example, where we might be supporting a borough. But we also have strategic-level schemes such as the Silvertown Tunnel, which we've been monitoring for the last two years; when it opens in 2025, we'll continue to monitor its progress, and its success or failure, into the future.
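The pre/post "time cut" idea can be sketched very simply: partition observations on a scheme's links around its go-live date and compare a metric across the two windows. This is an invented toy, not TfL's reporting pipeline; link ids, dates and speeds are all illustrative.

```python
from statistics import mean
from datetime import date

def scheme_impact(observations, scheme_links, go_live):
    """Mean speed on the scheme's links before and after its go-live date.
    observations: list of (date, link_id, speed_kmh)."""
    pre  = [s for d, l, s in observations if l in scheme_links and d < go_live]
    post = [s for d, l, s in observations if l in scheme_links and d >= go_live]
    return mean(pre), mean(post)
```

The same cut can be applied to any per-link metric held against the common network, which is what makes like-for-like pre/post comparison possible.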

And then finally, we also use it in a 3D construct as well. I mentioned earlier about digital twins being dressed up as a 3D visualisation; the difference with our Surface Live products is that they're ingesting real-time speed information and vehicle flows as well. It allows us to see the network in that context, and this continues to grow our digital view of the world.

So what are the next steps? The first is around dealing with other modes. Micromobility has really come on board, particularly within London, in the last couple of years, and understanding where people start and end their journeys, and where they traverse our network, will ultimately impact what we want to cover. So the growth of other modes is right at the top of the radar of what we're looking to achieve and add to our model. We also want to understand junction behaviour at a more granular level, what pedestrians and cyclists are doing, so we can make things safer for members of the public. And then lastly, there's the growth of telematics. We find that we've got an absolutely enormous queue of people wanting to work with TfL that have telematics data, freight companies for example. It's about how we understand and utilise that data whilst also respecting the benefits that are already in play in terms of performance.

And then the last thing I want to do is just finish on a few thoughts for anyone that might be sat there thinking: yeah, this is great, I'd love to undertake our own digital transformation and build a foundation in geospatial for a digital twin, but how would I do that? The first is to set a realistic expectation of accuracy over effort. We've seen it time and again. I was really amazed by the figures in the last presentation; that level of accuracy was fantastic to see. We started off all wide-eyed about delivering this product once a week, and were brought swiftly back down to earth once we'd started the process of building it out. We now do an annual update with three interim updates, one every quarter, and the reason we do that is because we build quality in, so everyone can be really confident in the product they're using.

The second, as I mentioned earlier, is about bringing everyone along on the journey with you. That is really important, but it's also important that the common goal of what you're looking to achieve is respected. So keep that at the forefront of your mind and don't be derailed by someone's specialist expectations around something. If it can't come on board, ask yourself: does it absolutely have to?

And then the third point, which I think was mentioned this morning, is around unified digital twins, or digital cities; it was really exciting to hear about Ordnance Survey's development in that area. But it's really key to outline from the off that there needs to be a really common standard around framing for geospatial foundations for digital twins, and to understand that if we are going to have these communications between digital twins, they need to be compatible.

And then the last one is just an honest reflection. TfL has been on this journey since before its formation in 2001, in terms of how and where its data is accessible, and we've got a really strong portfolio of data to call upon. Ask yourself where you are in your own journey, if that's of interest to you as well.

Thanks very much for listening and any questions?
