Make your data fit for purpose
Making your spatial data fit for its intended use is central to effective data stewardship.
But, first you need to be clear about what “fit” means in your specific context.
Our approach is designed to discover and precisely detail your requirements, check how well your data conforms to them, and then cost-effectively bring your data up to standard.
“1Spatial has a good understanding of what’s required and over the last couple of years we’ve developed a very collaborative relationship. It’s helped us massively. Not only does it save us a lot of time and money, but it means we end up with the best solution for our needs.”
Corporal Richard Jennings | No 1 Aeronautical Information Documents Unit
No 1 AIDU feeds forces with accurate aeronautical information
1Spatial has enabled No 1 AIDU to improve the quality and consistency of their data by maintaining it in a single, central database, rather than numerous individual charts. Alongside the increased productivity, No 1 AIDU is also benefiting from substantial time and cost savings.
What do we mean by data quality?
Data quality means ensuring your data is fit for its intended use.
No data is a perfect reflection of the real world so organisations typically decide what level of quality is acceptable. This could mean defining which rules are mandatory or optional and the acceptable conformance levels for each rule.
The key is understanding what you need.
Working with customers, we often help them to determine exact business requirements to allow them to define quantitative quality metrics.
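As a minimal sketch of what a quantitative quality metric can look like in practice (the rule names, field names, and thresholds below are hypothetical illustrations, not 1Spatial's actual tooling), a rule can be expressed as a check plus an acceptable conformance level:

```python
# Illustrative sketch only: hypothetical rules and thresholds,
# not 1Spatial's products or APIs.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]   # returns True when a record conforms
    mandatory: bool                 # mandatory rules must meet their threshold
    min_conformance: float          # acceptable conformance level, 0.0-1.0

def measure(rule: QualityRule, records: list[dict]) -> tuple[float, bool]:
    """Return the conformance rate and whether it meets the rule's threshold."""
    passed = sum(1 for r in records if rule.check(r))
    rate = passed / len(records) if records else 1.0
    return rate, rate >= rule.min_conformance

# Hypothetical example: every road record must have a positive length.
roads = [{"id": 1, "length_m": 120.5}, {"id": 2, "length_m": -3.0}]
rule = QualityRule("positive_length", lambda r: r.get("length_m", 0) > 0,
                   mandatory=True, min_conformance=0.99)
rate, ok = measure(rule, roads)  # rate == 0.5, ok == False
```

Making the threshold explicit is what turns "good enough" into a measurable, repeatable business decision: a mandatory rule might demand 99% conformance while an optional one tolerates less.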
The 1Spatial approach
We work with you to suit your operational needs and requirements: we can provide our solutions and expertise as a managed service, or alternatively you can use our tools yourselves. Either way, we start with a detailed Data Quality Assessment, which includes a data discovery process that defines precisely your data requirements.
The discovery process is followed by Data Conformance Checking that assesses your existing data against your requirements to provide a detailed, baseline analysis of your data and any gaps.
The Data Quality Assessment can then be repeated on an ongoing basis as a persistent Data Quality Management process, in which our automated, rules-based tools cleanse your data and keep it continuously up to standard.
Performing a Data Quality Assessment and addressing any issues raised are traditionally labour-intensive, time-consuming manual processes.
Our automated, rules-based approach is different:
- Cost-effective and efficient – Data rules apply the best judgement of your experts, consistently, automatically and on demand. Our solution can quickly find and fix the majority of data issues, identifying any remaining complex issues for the attention of your experts.
- Collaborative – We work with the users and creators of your data to agree your real requirements. Our user-defined and user-managed rules make data quality an explicit process that can be discussed and adjusted with business needs.
- Consistent – Once defined, data rules can be run at any time, across your organisation to produce a consistent measurement of quality and to apply repair and cleansing rules in a consistent, repeatable way.
- Thorough – We assess accuracy, consistency, correctness, currency and completeness of your data against requirements. And, our rules ensure the same, thorough assessment can be repeated as required at any time, on any data set, anywhere in your organisation.
- Enterprise-wide and technology neutral – Using a single, central repository of user-managed rules means you can run the same rules across all of your data sets, regardless of technology platform or format. It also means avoiding the risk of being locked into ageing technology by expensive bespoke programming.
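The find-fix-flag workflow described above can be sketched in a few lines. This is an illustrative example under assumed record and rule structures (the field names, rules, and `cleanse` helper are hypothetical, not 1Spatial's actual software): each user-defined rule carries a check, and optionally an automated repair; records that fail a rule with no repair are flagged for expert attention.

```python
# Illustrative sketch: user-defined rules with automated repair steps.
# All names and data here are hypothetical.
from typing import Callable, Optional

class Rule:
    def __init__(self, name: str,
                 check: Callable[[dict], bool],
                 repair: Optional[Callable[[dict], dict]] = None):
        self.name, self.check, self.repair = name, check, repair

def cleanse(records: list[dict], rules: list[Rule]):
    """Apply each rule; auto-repair where possible, flag the rest for experts."""
    flagged = []
    for rec in records:
        for rule in rules:
            if not rule.check(rec):
                if rule.repair:
                    rec.update(rule.repair(rec))            # automated fix
                else:
                    flagged.append((rec["id"], rule.name))  # needs an expert
    return records, flagged

rules = [
    Rule("name_trimmed", lambda r: r["name"] == r["name"].strip(),
         repair=lambda r: {"name": r["name"].strip()}),
    Rule("has_geometry", lambda r: r.get("geometry") is not None),  # no auto-fix
]
data = [{"id": 1, "name": " Main St ", "geometry": "LINESTRING(0 0, 1 1)"},
        {"id": 2, "name": "High St", "geometry": None}]
cleaned, flagged = cleanse(data, rules)
# cleaned[0]["name"] == "Main St"; flagged == [(2, "has_geometry")]
```

Because the rules are data, not code baked into one system, the same set can in principle be run against any dataset in any format once it is read into a common record shape, which is what makes the measurement consistent and technology neutral.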
For help getting your data into shape and keeping it that way, please contact us.