The Little Book of Spatial Data Quality
In our free Little Book of Spatial Data Quality, we look at why ensuring data quality is critical and how organisations are beginning to treat it as an ongoing process by deploying solutions that automate their data quality and data management procedures.
Make your data fit for purpose
Making your spatial data fit for its intended use is central to effective data stewardship.
To achieve this you first need to be clear about what “fit for purpose” means in your specific context.
Our approach is designed to help you discover and precisely define your data quality requirements, to check how your data conforms to those requirements, and then to cost-effectively bring your data up to standard.
What do we mean by data quality?
Data quality means ensuring your data is fit for its intended use.
No data is a perfect reflection of the real world, so organisations typically decide what level of quality is acceptable. This could mean defining which rules are mandatory or optional, and the acceptable conformance level for each rule.
The key is understanding what you need.
Working with clients, we help them determine their exact business requirements so that they can define quantitative quality metrics.
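To make the idea of a quantitative quality metric concrete, here is a minimal sketch in plain Python (not 1Spatial's API; the rule, field names and threshold are hypothetical). Each rule pairs a per-record check with a mandatory/optional flag and an acceptable conformance rate:

```python
# Illustrative sketch only: expressing quantitative quality rules as data,
# each with a severity flag and a minimum acceptable conformance rate.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]   # True if a record conforms
    mandatory: bool                 # mandatory vs optional rule
    min_conformance: float          # acceptable pass rate, e.g. 0.99

def conformance(rule: QualityRule, records: list[dict]) -> float:
    """Fraction of records that satisfy the rule."""
    passed = sum(1 for r in records if rule.check(r))
    return passed / len(records) if records else 1.0

# Hypothetical rule: every road record must carry a non-empty name.
road_name_rule = QualityRule(
    name="road-has-name",
    check=lambda r: bool(r.get("name", "").strip()),
    mandatory=True,
    min_conformance=0.99,
)

records = [{"name": "High St"}, {"name": ""}, {"name": "Mill Lane"}]
rate = conformance(road_name_rule, records)
print(f"{road_name_rule.name}: {rate:.0%} conformant "
      f"({'OK' if rate >= road_name_rule.min_conformance else 'BELOW TARGET'})")
```

Because the threshold is part of the rule, "fit for purpose" becomes a measurable, discussable number rather than a matter of opinion.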
The 1Spatial approach
We work with you to suit your operational needs and requirements: we can provide our solutions and expertise as a managed service, or you can use our tools yourselves. Either way, we start with a detailed Data Quality Assessment, which includes a data discovery process that precisely defines your data requirements.
The discovery process is followed by Data Conformance Checking, which assesses your existing data against those requirements to provide a detailed baseline analysis of your data and any gaps.
The Data Quality Assessment can then be repeated on an ongoing basis as a persistent Data Quality Management process, in which our automated, rules-based tools cleanse your data and keep it continuously up to standard.
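The assess-then-cleanse cycle described above can be sketched as follows (assumed names, not a real product API). Each rule pairs a conformance check with an optional automatic repair; records the repair cannot fix are flagged for expert attention:

```python
# Minimal sketch of the assess -> cleanse -> re-assess cycle.
# A rule is (check, fix): the check tests one record, the optional fix
# returns corrected field values for a non-conforming record.
from typing import Callable, Optional

Rule = tuple[Callable[[dict], bool], Optional[Callable[[dict], dict]]]

def assess(records: list[dict], rules: dict[str, Rule]) -> dict[str, float]:
    """Baseline report: pass rate per rule."""
    return {name: sum(check(r) for r in records) / len(records)
            for name, (check, _fix) in rules.items()}

def cleanse(records: list[dict], rules: dict[str, Rule]):
    """Apply automatic repairs; collect records needing expert review."""
    needs_expert = []
    for r in records:
        for name, (check, fix) in rules.items():
            if not check(r):
                if fix:
                    r.update(fix(r))
                if not check(r):          # repair failed or none exists
                    needs_expert.append((name, r))
    return records, needs_expert

# Hypothetical rule: postcodes must be upper case; repair by upper-casing.
rules: dict[str, Rule] = {
    "postcode-upper": (lambda r: r["postcode"].isupper(),
                       lambda r: {"postcode": r["postcode"].upper()}),
}
data = [{"postcode": "ab1 2cd"}, {"postcode": "EF3 4GH"}]
print(assess(data, rules))           # baseline conformance
data, flagged = cleanse(data, rules)
print(assess(data, rules), flagged)  # conformance after automated repair
```

Re-running `assess` after `cleanse` is what turns a one-off audit into a repeatable management process: the same report, on demand, after every change.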
Performing a Data Quality Assessment and addressing the issues it raises are traditionally labour-intensive, time-consuming manual processes.
Our automated, rules-based approach is different:
- Cost-effective and efficient – Data rules apply the best judgement of your experts, consistently, automatically and on demand. Our solution can quickly find and fix the majority of data issues, identifying any remaining complex issues for the attention of your experts.
- Collaborative – We work with the users and creators of your data to agree your real requirements. Our user-defined and user-managed rules make data quality an explicit process that can be discussed and adjusted with business needs.
- Consistent – Once defined, data rules can be run at any time, across your organisation, to produce a consistent measurement of quality and to apply repair and cleansing rules in a consistent, repeatable way.
- Thorough – We assess the accuracy, consistency, correctness, currency and completeness of your data against your requirements. And our rules ensure the same thorough assessment can be repeated as required at any time, on any data set, anywhere in your organisation.
- Enterprise-wide and technology-neutral – Using a single, central repository of user-managed rules means you can run the same rules across all of your data sets, regardless of technology platform or format. It also means avoiding the risk of being locked into ageing technology by expensive bespoke programming.
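The "single repository of rules, many platforms" idea in the last point can be illustrated with a small sketch (illustrative only; the adapters and rule are hypothetical). One adapter per source format normalises records into plain dictionaries, so a single shared rule set measures every data set the same way:

```python
# Sketch of technology-neutral rule checking: format-specific readers
# normalise each source into plain records, then one central rule set
# scores them all identically.
import csv
import io
import json

RULES = {"has-id": lambda r: bool(r.get("id"))}  # shared rule repository

def read_csv(text: str) -> list[dict]:           # one adapter per format
    return list(csv.DictReader(io.StringIO(text)))

def read_geojson(text: str) -> list[dict]:
    return [f["properties"] for f in json.loads(text)["features"]]

def score(records: list[dict]) -> dict[str, float]:
    return {name: sum(rule(r) for r in records) / len(records)
            for name, rule in RULES.items()}

csv_data = "id,name\n1,Bridge\n,Culvert\n"
geo_data = json.dumps({"features": [{"properties": {"id": "7"}}]})

print(score(read_csv(csv_data)))      # same rules applied to a CSV source
print(score(read_geojson(geo_data)))  # same rules applied to GeoJSON
```

Because the rules never reference a storage format, swapping a database or file format means writing a new adapter, not rewriting the quality logic.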