Index of Resources:
[Procedural synopsis: The National Committee for Digital Cartographic Data Standards, operated by the American Congress on Surveying and Mapping, met from 1982 to 1988 to produce a draft proposed standard. The US Geological Survey took this result and worked with the Federal Interagency Coordinating Committee for Digital Cartography, and later the Federal Geographic Data Committee, to turn the draft into the Spatial Data Transfer Standard, adopted as FIPS PUB 173 in 1991. Following another period of review and changes (mostly to the raster modules), SDTS was adopted by ANSI (the American National Standards Institute) as ANSI NCITS 320-1998 on June 9, 1998. While the format specification changed, the data quality language remained unchanged from the 1985 draft.]
Text of Data Quality Specification
Pieces of the SDTS Data Quality Specification in context; training materials on SDTS include a short version of Data Quality (page 14, #20 in .pdf)
The standards world is VAST.
Data Quality in DIGEST (NIMA/NATO; now a .org!)
Example of Data Quality (in .pdf) for Digital Elevation Data (USGS)
FGDC Standard for Geospatial Positioning Accuracy (adopted 1998)
Geodetic standards (and surveying standards in general) use redundant measurement (closure of traverse; residual from a least-squares fit); for the geodetic network, there is no external standard of higher accuracy.
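Closure of a traverse is the simplest of these redundancy checks to illustrate. The sketch below is a minimal example, not taken from any of the standards linked here: it assumes a closed loop recorded as (distance, azimuth) legs, sums the coordinate components, and reports the linear misclosure and its ratio to the total length traversed. The function name traverse_misclosure and the sample legs are hypothetical.

```python
import math

def traverse_misclosure(legs):
    """Compute the linear misclosure of a closed traverse.

    legs: list of (distance, azimuth_degrees) tuples measured around
    a loop that should return to its starting point.
    Returns (misclosure, relative_precision).
    """
    dx = dy = total = 0.0
    for distance, azimuth in legs:
        a = math.radians(azimuth)
        dx += distance * math.sin(a)   # easting component
        dy += distance * math.cos(a)   # northing component
        total += distance
    misclosure = math.hypot(dx, dy)    # how far the loop fails to close
    return misclosure, misclosure / total

# Example: a small four-leg loop (units arbitrary, values invented)
legs = [(100.0, 0.0), (100.0, 90.0), (100.02, 180.0), (99.98, 270.0)]
error, ratio = traverse_misclosure(legs)
print(f"misclosure = {error:.3f}, relative precision = 1:{1/ratio:.0f}")
```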
ASPRS standards for 'large-scale line maps': check well-defined points against an independent source and tabulate the distances.
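As one reading of the "tabulate distance" step, the sketch below pairs map coordinates with independent check coordinates by point ID and computes the per-point distances and their RMSE. The dictionaries and the function positional_errors are illustrative; the actual thresholds and test procedures are defined in the ASPRS standard itself.

```python
import math

def positional_errors(map_pts, check_pts):
    """Tabulate distances between map positions and independent
    check positions for well-defined points, keyed by point ID."""
    distances = []
    for pid, (xm, ym) in map_pts.items():
        xc, yc = check_pts[pid]
        distances.append(math.hypot(xm - xc, ym - yc))
    rmse = math.sqrt(sum(d * d for d in distances) / len(distances))
    return distances, rmse

# Invented coordinates for two well-defined points
map_pts = {"A": (1001.2, 2000.4), "B": (1500.8, 2499.1)}
check_pts = {"A": (1001.0, 2000.0), "B": (1501.0, 2499.5)}
dists, rmse = positional_errors(map_pts, check_pts)
print(dists, rmse)
```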
Defense Mapping Agency Mapping, Charting and Geodesy Standards, actual text
Procedures for testing 'ill-defined points' (meaning lines) are not well established.
Continuous data treated as position (differences as distances); see the DEM standard at USGS.
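For continuous surfaces the same idea reduces to elevation differences at test points. A minimal sketch, assuming DEM elevations and higher-accuracy reference elevations have already been matched point-for-point (the values shown are made up):

```python
import math

def vertical_rmse(dem_values, reference_values):
    """Treat elevation differences at test points as 'distances':
    root-mean-square error of DEM minus reference elevations."""
    diffs = [d - r for d, r in zip(dem_values, reference_values)]
    return math.sqrt(sum(e * e for e in diffs) / len(diffs))

# Hypothetical test points: DEM elevations vs. surveyed elevations (metres)
dem = [231.4, 228.9, 240.1, 235.0]
ref = [231.0, 229.3, 239.8, 235.6]
print(f"vertical RMSE = {vertical_rmse(dem, ref):.2f} m")
```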
Categories: a random sample of points produces a misclassification matrix.
Alternative: exhaustive overlay of two sources (gives positional as well as attribute information); this produces a matrix of areas rather than point counts. Such matrices are also called confusion matrices: see, for example, the Old Growth Mapping tests.
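Both designs end up in the same cross-tabulation; only the cell weights differ. A hedged sketch, assuming each check record carries a mapped class, a reference class, and (for the overlay case) an intersection area; the field names and the Old Growth-style classes are illustrative:

```python
from collections import defaultdict

def confusion_matrix(samples, weight=lambda s: 1):
    """Cross-tabulate mapped category against reference category.

    With the default weight of 1 the cells are point counts; pass the
    polygon area as the weight to get an area-based matrix instead.
    """
    matrix = defaultdict(float)
    for s in samples:
        matrix[(s["mapped"], s["reference"])] += weight(s)
    return dict(matrix)

# Random point sample: each record is one check point
points = [
    {"mapped": "old growth", "reference": "old growth"},
    {"mapped": "old growth", "reference": "second growth"},
    {"mapped": "second growth", "reference": "second growth"},
]
print(confusion_matrix(points))

# Exhaustive overlay: each record is one intersection polygon with its area (ha)
polygons = [
    {"mapped": "old growth", "reference": "old growth", "area": 41.2},
    {"mapped": "old growth", "reference": "second growth", "area": 3.7},
]
print(confusion_matrix(polygons, weight=lambda s: s["area"]))
```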
Topological verification is a feature of most packages, other checks are ad hoc.
Test coverage (set of objects) against a list of all objects (gaps in both directions)
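This completeness test is just set comparison run both ways. A minimal sketch, assuming object identifiers can be matched between the coverage and the master list (the road IDs are invented):

```python
def completeness_check(coverage_ids, master_list_ids):
    """Compare the objects present in a coverage against an
    independent list of all objects; report gaps in both directions."""
    coverage = set(coverage_ids)
    master = set(master_list_ids)
    missing_from_coverage = master - coverage   # omissions
    not_on_list = coverage - master             # commissions / extras
    return missing_from_coverage, not_on_list

omissions, commissions = completeness_check(
    ["road_1", "road_2", "road_4"],
    ["road_1", "road_2", "road_3", "road_4"],
)
print("missing:", omissions, "extra:", commissions)
```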
Test mapping rules: minimum width, minimum area (minimum mapping unit)
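A minimum-mapping-unit check can be sketched as a simple filter, assuming each polygon record already carries its area and some proxy for width; the attribute names and thresholds here are illustrative, not taken from any particular specification.

```python
def mmu_violations(polygons, min_area, min_width=None):
    """Flag polygons that fall below the minimum mapping unit.

    polygons: list of dicts with 'id', 'area', and optionally 'width'
    (e.g. the narrower bounding-box dimension as a crude width proxy).
    """
    bad = []
    for p in polygons:
        if p["area"] < min_area:
            bad.append((p["id"], "below minimum area"))
        elif min_width is not None and p.get("width", float("inf")) < min_width:
            bad.append((p["id"], "below minimum width"))
    return bad

polys = [{"id": 1, "area": 0.3, "width": 12.0},
         {"id": 2, "area": 2.5, "width": 4.0}]
print(mmu_violations(polys, min_area=0.5, min_width=5.0))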
Text of QC agenda:
8.14 15046-14 Quality evaluation procedures
This part establishes a framework of quality evaluation procedures for a dataset of geo-spatial data so that data producers can define how well their products meet their product specification and users can define their requirements and how well they are met. The product specification or user requirements should allow the acceptable quality levels to be determined for each quality metric. An estimation of a dataset's quality is made by sampling, computer processing and/or indirectly by deduction for comparison with the acceptable quality level. This allows reporting on quality evaluation results.
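One way to read the sampling step described above, as a hedged sketch rather than the ISO procedure itself: draw a random sample, estimate the error rate with a user-supplied conformance test, and compare it with the acceptable quality level from the product specification or the user's requirements. All names and thresholds below are illustrative.

```python
import random

def evaluate_quality(dataset, check_item, acceptable_error_rate, sample_size, seed=0):
    """Estimate a dataset's error rate from a random sample and
    compare it with an acceptable quality level."""
    random.seed(seed)
    sample = random.sample(dataset, min(sample_size, len(dataset)))
    errors = sum(1 for item in sample if not check_item(item))
    estimated_rate = errors / len(sample)
    return {
        "sample_size": len(sample),
        "estimated_error_rate": estimated_rate,
        "acceptable_error_rate": acceptable_error_rate,
        "conforms": estimated_rate <= acceptable_error_rate,
    }

# Illustrative: records conform if their attribute code is in a valid set
valid_codes = {"A", "B", "C"}
records = [{"code": random.choice("ABCX")} for _ in range(1000)]
report = evaluate_quality(records, lambda r: r["code"] in valid_codes,
                          acceptable_error_rate=0.05, sample_size=100)
print(report)
```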
The Core Concepts document provided its version of map error; this was a project started (and ended) four years ago. The page is gone, like the project...
"Context errors" - relationships that should appear,
given other information shown or not shown
Any organization of errors (from SDTS Data Quality or from Microsoft's taxonomy) ends up relabelling the same basic elements.
Users/evaluators send in digital records; contractors/suppliers will be notified.
Microsoft is moving from the old days (when maps were expected to be a professional responsibility) to an era when the consumer will play a much more active role.
Version of 6 November 2002