Geospatial is not only about where, but also when
As geospatial technology advances and becomes critical to almost every element of the world around us, we are saturated with geospatial data. It is remarkable how accurate the data obtained from governing bodies, environmental agencies and even transportation agencies has become.
Geospatial specialists can open their GIS or spatial software and access thousands, even millions, of datasets that can be used to model, analyze or visualize.
Almost all of these data are snapshots of the present or of the last known event, yet the real world is always changing. It is not good practice to compare last year’s conservation data with this year’s aerial imagery, or with decade-old forestry data.
Still, it happens all the time, and it is not a huge problem if the real world has not changed too much. After all, if accuracy is important to a project, it can be captured through a survey. But the problem does not stop there.
It is also important to take into account the data that came before. How fast is the city growing, and is it likely to affect the new area under consideration? Where is the coastline disappearing? These are problems that can only be solved by using the data as it changes over time.
Geospatial data standards do a great job of ensuring that the correct names and terms are used. ISO 19115 even has a few lines about temporal resolution in the metadata, but how many data providers also provide their historical data? Furthermore, who is responsible for ensuring that this historical data exists?
When we look at almost all of the national data that is openly provided to support the millions of companies using geospatial information to run their projects, there are no links or references to the availability of historical or past data. This includes providers such as Natural England, Defra, Historic England and Ordnance Survey.
History
The irony is that weather data, natural hazard data, earth observation and tide data are readily available as monthly and annual series covering a decade or more.
Some of these datasets are more detailed than others, but all are modeled so that valuable information can be extracted; techniques such as regression analysis and AI can be run on them to predict what might happen in the near or distant future.
This temporal geospatial data is now more valuable than ever. There is increasing pressure to provide affordable and green housing and to ensure that environmental targets are met.
This can only be done by understanding current issues and then using how the information has changed over time to anticipate where future changes may occur and how to influence them.
This applies not only to housing, but also to the energy sector, where community acceptance, weather and land availability can all change, and to the transport sector, which must forecast urban sprawl and the need for new stations as working habits change.
Even the leisure industry benefits, through the use of environmental, historical, customer-habit and transportation data over time. The list could go on endlessly, for almost any industry.
What needs to change
In an ideal world, geospatial policies would change to ensure that geospatial data providers are responsible for maintaining historical snapshots of their data and making them available, alongside the current data. All of this would, of course, be supported with metadata.
There are many legal reasons for capturing the final source data in snapshots, but the most compelling is when a project is built and, years later, an accident or complete failure occurs and the information needs to be analysed.
When this happens, it must be verified that the data used was the best available at the time and that nothing was missed. Without definitive historical data, that verification is impossible.
There are currently many renewable energy projects halted in the UK due to a change in government policy in 2016. Some of these projects have already been approved, but multiple interested parties require validation of the information used in the project, and they do not have access to the data as it was at that time.
As more geospatial data is used to secure and validate projects, it becomes ever more important to have a way to carry out due diligence on them. One only has to look at the growing housing market, the new infrastructure required to accommodate a growing population, or the military services faced with ever-changing threats.
At this time, there is no mandate or requirement for data providers to maintain or provide historical data. For many providers, users can only build up a history themselves, by deploying the data to their own systems and timestamping it, or by archiving old versions while keeping the latest. This works fine, but only from the point at which you first deployed the data.
Another problem is that there is little or no way to “top up” this data: there is no central repository or help desk that can be called on to provide ten years’ worth of data. That is partly because of the size of the data and the difficulty of sharing it, but also because of the time that would have to be invested in recovering it.
What now?
The ISO 27001 corporate certification covers data retention and storage, which extends to geospatial data in that archives are retained and stored for a specified time (defined by policy).
This means that data is recorded historically, but not in a usable way. Ideally, a change-only approach should be implemented, where new records are inserted with timestamps, deleted records are also timestamped, and each record carries an identifier indicating whether it was inserted or deleted.
By using a change-only update (COU) method, historical data can be recorded in the most efficient way while remaining readable in geospatial software.
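As an illustration, the sketch below shows what such a COU log and a point-in-time reconstruction might look like for a hypothetical point layer; the column names (feature_id, change, changed_at) are purely illustrative, not drawn from any published standard.

```python
# A minimal sketch of a change-only update (COU) log, assuming a hypothetical
# point layer. One row is stored per change event: a feature is either
# inserted or deleted at a given time.
from datetime import datetime
import geopandas as gpd
from shapely.geometry import Point

cou_log = gpd.GeoDataFrame(
    {
        "feature_id": [1, 2, 2],
        "change": ["insert", "insert", "delete"],
        "changed_at": [
            datetime(2015, 3, 1),
            datetime(2018, 6, 1),
            datetime(2022, 9, 1),
        ],
        "geometry": [Point(0, 0), Point(1, 1), Point(1, 1)],
    },
    crs="EPSG:27700",
)

def snapshot(log: gpd.GeoDataFrame, when: datetime) -> gpd.GeoDataFrame:
    """Reconstruct the layer as it existed at `when`: keep each feature only
    if its most recent change on or before `when` was an insert."""
    past = log[log["changed_at"] <= when].sort_values("changed_at")
    latest = past.groupby("feature_id").tail(1)
    return latest[latest["change"] == "insert"]

# Feature 2 appears in the 2020 snapshot but not in the 2023 one.
print(snapshot(cou_log, datetime(2020, 1, 1)))
print(snapshot(cou_log, datetime(2023, 1, 1)))
```

The point of the design is that only changes are stored, so the archive stays small, yet any historical state can be rebuilt on demand.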
For more than a decade, GIS packages such as Esri ArcMap and QGIS have had time viewers (time sliders and time managers) that allow data to be viewed over time by reading a mapped time field, and can even export videos showing the change over time.
Web mapping can also display time-enabled data; platforms such as CesiumJS include time sliders out of the box that allow data to be viewed over time.
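Feeding these viewers usually requires nothing more than an ordinary datetime field on the layer. A small, self-contained sketch of preparing such a layer is shown below; the file, layer and field names are illustrative only.

```python
# A minimal sketch of a time-enabled layer: a datetime field that a time
# viewer (for example the QGIS temporal controller) can be pointed at.
from datetime import datetime
import geopandas as gpd
from shapely.geometry import Point

observations = gpd.GeoDataFrame(
    {
        "observed_at": [datetime(2015, 1, 1), datetime(2020, 1, 1)],
        "geometry": [Point(-1.5, 53.8), Point(-1.5, 53.8)],
    },
    crs="EPSG:4326",
)

# GeoPackage preserves the datetime type, so a viewer can animate it directly.
observations.to_file("observations.gpkg", layer="observations", driver="GPKG")
```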
As we move towards AGI (Artificial General Intelligence) and the need to train and inform AI models grows, there will be an increasing need to leverage historical data to truly understand trajectories of growth and change.
In AI and ML, this data is starting to move away from GIS and is called “spatiotemporal data”, although in reality it is still geospatial with an element of time.
My prediction is that there will have to be a change in the way government agencies provide data, and this could lead to a new data format. Just as we embraced Z (altitude) enabled data, where Z was not an attribute field but a reference, we will need to embed time and change into the data itself, not only to remove data bloat, but also to ensure that time zone references are stored and linked to the coordinate systems.
Could we see Z-, M- and T-enabled data? I hope so; it would help with better BIM integrations and planning projects.
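To make the idea concrete, here is a purely conceptual sketch of what a Z-, M- and T-enabled record could look like. None of these names come from an existing format or standard; the point is simply that time, with its own reference, would sit alongside the coordinate reference rather than being bolted on as an attribute.

```python
# A purely illustrative sketch, not an existing format: a record carrying
# X, Y, Z, M and T together, with time keeping its own time zone reference
# alongside the coordinate reference system.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SpatioTemporalPoint:
    x: float                 # easting / longitude
    y: float                 # northing / latitude
    z: float                 # altitude, referenced to a vertical datum
    m: float                 # measure value (e.g. chainage along a route)
    t: datetime              # timestamp, carrying its own time zone
    crs: str = "EPSG:27700"  # horizontal coordinate reference system

survey_point = SpatioTemporalPoint(
    x=430000.0, y=433000.0, z=48.2, m=120.5,
    t=datetime(2021, 5, 17, 10, 30, tzinfo=timezone.utc),
)
```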
The motto of GIS is that “everything happens somewhere”. I suggest we now mature to the next evolution of GIS and change the motto to “everything happens somewhere, sometime”.
Disclaimer: The opinions expressed are those of the author. Geospatial World may or may not endorse them.