"Digital Twin" is term used all the time now as it can apply in so many situations. The Wikipedia definition of one as "a digital model of a physical product, system, or process that serves as a digital counterpart for simulation, integration, testing, monitoring, and maintenance" sums it up but that could apply to using ANY piece of digital data for decisions. A railway timetable for example, or the alert in my car telling me the mileage means it's time for a service.

'Up-to-date-ness' (aka 'currency') is what distinguishes a digital twin from just some data, and that's the key here. A digital twin represents how something (typically a physical asset) is performing 'right now'. Without that up-to-date information, the system is not a digital twin but a digital birth certificate: the asset is no longer performing as it did when it was first made, and it's unrealistic to assume it hasn't changed.

Traditionally this up-to-date view is achieved by adding live sensor information to the data. This approach was initially developed for NASA missions and F1 racing, where decisions need to be made quickly: not based on ‘how should this thing we built perform in theory?’ but ‘how is this thing we built actually performing after 50 laps of a race?’. The technique was then adopted by asset and infrastructure management to do the same for physical assets such as water or road networks, providing better analytics on how a network is actually performing right now by measuring water flow, traffic flow or bridge vibrations.

These systems are what I call 'high frequency' digital twins: the measurements and updates happen rapidly, orders of magnitude faster than the lifetime of the asset being measured. But as the focus broadened from individual cars, assets and buildings to networks, cities and then countries, a digital twin approach helps to model all sorts of aspects of smart cities and society in general: How is the daily commute for everyone? Which parts of a city will be most affected by floods? What will happen to the electricity network at 11am on a sunny day when everyone has solar panels and is also charging electric cars?

These broader questions require not just an individual network's asset data but data from any other part of the national built or natural environment. National mapping agencies and central government departments are seeing their data used not just for maps or local planning, but for national ‘digital twin’ simulations. Many of these use-cases work best when spatial data is available as clean, well-structured, up-to-date 3D objects. Many national mapping agencies are therefore focussing on generating (and, most importantly, maintaining) a national 3D dataset in which each 3D object can be considered a digital twin of the real-world object. Earth observation and LIDAR scans provide raw information to speed up this maintenance, but the raw data needs processing into structured object information.

If the data is kept constantly up to date then it can be considered a ‘low frequency' digital twin, where what matters is the presence and size of the buildings, and so it only needs to be updated at a frequency similar to the lifecycle of those assets. If simulating flooding, for example, you only need to know the presence and shape of each building.

When combined with sensor data and other network information, the low frequency digital twin also contributes to a ‘high frequency’ digital twin, with live sensor readings used to track traffic, electricity demand and water levels, enabling analysis of the performance of the whole city or country at high frequency and fidelity.
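As a minimal sketch of this combination, the low frequency twin can be thought of as static asset records that are overlaid with the latest high frequency sensor readings. The asset IDs, field names and readings below are entirely hypothetical, purely to illustrate the idea, not any real system's schema:

```python
# Hypothetical sketch: enrich static asset records (low frequency twin)
# with the most recent sensor readings (high frequency twin).

static_assets = {
    "bridge-17": {"type": "bridge", "built": 1972},
    "pipe-204": {"type": "water_main", "diameter_mm": 300},
}

sensor_readings = [  # (asset_id, metric, value), newest last
    ("bridge-17", "vibration_hz", 4.1),
    ("pipe-204", "flow_l_per_s", 12.5),
    ("bridge-17", "vibration_hz", 4.7),
]

def latest_state(assets, readings):
    """Overlay the most recent reading per (asset, metric) on the static data."""
    state = {aid: dict(attrs) for aid, attrs in assets.items()}
    for aid, metric, value in readings:
        if aid in state:
            state[aid][metric] = value  # later readings overwrite earlier ones
    return state

print(latest_state(static_assets, sensor_readings)["bridge-17"])
# {'type': 'bridge', 'built': 1972, 'vibration_hz': 4.7}
```

The point of the sketch is that the static record and the live reading answer different questions (what is this asset? versus how is it performing right now?), and the twin only becomes 'high frequency' once both are kept current.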

Many of the projects we do at 1Spatial are related to this type of data. Some projects explicitly use the term digital twin and some don’t. Denmark’s national mapping/data agency is building a 3D building dataset as a digital twin, and we are helping them with data quality measurement and improvement of these 3D buildings. Many initial use cases for national digital twins of buildings relate to property/land usage, valuation and, hence, taxation: the size of a building, the views from it and whether it is in shadow could all affect valuation, and its usage as a commercial or residential property will affect taxes. A 3D building dataset will therefore support these calculations and help detect where incorrect or unfair property tax has been applied.

Thereafter, it is the change in those buildings that really delivers value: has a building increased its number of floors, has the footprint grown through an extension, has the usage changed between commercial and residential? All of these significantly affect valuation and the application of correct tax rates, as well as simulations for floods or wind speed. At 1Spatial, we support customers in (1) detecting these changes and then (2) automating the impact of those changes on downstream processes such as applying the correct rates.
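To make step (1) concrete, change detection between two snapshots of a building dataset can be sketched as a record-by-record comparison keyed on a building identifier. This is a simplified illustration under assumed field names and thresholds (building_id, floors, footprint_m2, usage, and a 5 m² tolerance are all inventions for this sketch, not 1Spatial's actual implementation):

```python
# Minimal sketch of building change detection between two dataset snapshots.
# Field names and the footprint tolerance are illustrative assumptions.

def detect_changes(old_snapshot, new_snapshot, footprint_tolerance_m2=5.0):
    """Compare two snapshots keyed by building_id and report changes
    that could affect valuation, tax rates or flood simulation."""
    old = {b["building_id"]: b for b in old_snapshot}
    changes = []
    for new_b in new_snapshot:
        old_b = old.get(new_b["building_id"])
        if old_b is None:
            changes.append((new_b["building_id"], "new building"))
            continue
        if new_b["floors"] != old_b["floors"]:
            changes.append((new_b["building_id"], "floor count changed"))
        if abs(new_b["footprint_m2"] - old_b["footprint_m2"]) > footprint_tolerance_m2:
            changes.append((new_b["building_id"], "footprint changed"))
        if new_b["usage"] != old_b["usage"]:
            changes.append((new_b["building_id"], "usage changed"))
    return changes

old = [{"building_id": 1, "floors": 2, "footprint_m2": 80.0, "usage": "residential"}]
new = [{"building_id": 1, "floors": 3, "footprint_m2": 80.0, "usage": "commercial"},
       {"building_id": 2, "floors": 1, "footprint_m2": 50.0, "usage": "residential"}]
print(detect_changes(old, new))
# [(1, 'floor count changed'), (1, 'usage changed'), (2, 'new building')]
```

In practice the comparison would be geometric (3D footprints and heights rather than flat attributes), but the principle is the same: each detected change is an event that can trigger the downstream processes in step (2), such as re-applying the correct tax rate.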

Some projects don’t use the term digital twin though they have very similar outcomes. For the UK National Underground Asset Register (NUAR), 1Spatial tools allow asset owners to keep their data up to date within NUAR, and you could consider NUAR itself a digital twin. The initial use-case is safe digging, but the Government custodians of the project would like to use it for wider societal benefits. The data specifying which assets lie underground is a low frequency digital twin; combining it with sensor data about how those assets are performing, or levels of traffic, or weather data would make it a high frequency digital twin. In both cases, the key part is up-to-date data to support analysis and decision making. We are also seeing increased requests from other governments for 3D twins of underground assets, as well as the same approach for underwater assets given the strategic importance of underwater cables.

In summary, digital twins can be based on both high frequency and low frequency data, but what makes them valuable is that the data is constantly updated, and what makes them viable is that the update process is as automated as possible.