Digital Classification and Mapping of Urban Tree Cover: City of Minneapolis Metadata



Digital Classification and Mapping of Urban Tree Cover: City of Minneapolis



This page last updated: 09/28/2014. Metadata created using Minnesota Geographic Metadata Guidelines.


Metadata Summary

Originator Remote Sensing and Geospatial Analysis Laboratory, University of Minnesota
Abstract QuickBird multispectral imagery and LiDAR data were used to classify land cover for the City of Minneapolis.
Browse Graphic View a sample of the data
Time Period of Content Date 2009
Currentness Reference QuickBird satellite imagery acquired on June 25, 2009

LiDAR imagery acquired in June 2007 was available from the U.S. Army Corps of Engineers

Access Constraints The Remote Sensing and Geospatial Analysis Laboratory, University of Minnesota, has attempted to produce accurate maps, statistics and information of land cover and impervious surface area. However, it makes no representation or warranties, either expressed or implied, for the data accuracy, currency, suitability or reliability for any particular purpose. Although every effort has been made to ensure the accuracy of information, errors and conditions originating from the source data and processing may be present in the data supplied. Users are reminded that all geospatial maps and data are subject to errors in positional and thematic accuracy. The user accepts the data “as is” and assumes all risks associated with its use. The University of Minnesota and the Minnesota Pollution Control Agency assume no responsibility for actual or consequential damage incurred as a result of any user's reliance on the data. The data are the intellectual property of the University of Minnesota.
Use Constraints This data may be used for educational and non-commercial purposes, provided proper attribution is given. Secondary distribution of the data is permitted, but not supported by the University of Minnesota. By accepting the data, the user agrees not to transmit this data or provide access to it or any part of it to another party unless the user includes with the data a copy of this disclaimer.
Distributor Organization Remote Sensing and Geospatial Analysis Lab, University of Minnesota
Ordering Instructions see website or contact info
Online Linkage Click here to download data. (See Ordering Instructions above for details.) By clicking here, you agree to the notice in "Distribution Liability" in Section 6 of this metadata.


Full metadata for Digital Classification and Mapping of Urban Tree Cover: City of Minneapolis


Go to Section:
1. Identification_Information
2. Data_Quality_Information
3. Spatial_Data_Organization_Information
4. Spatial_Reference_Information
5. Entity_and_Attribute_Information
6. Distribution_Information
7. Metadata_Reference_Information


Section 1 Identification Information
Originator Remote Sensing and Geospatial Analysis Laboratory, University of Minnesota
Title Digital Classification and Mapping of Urban Tree Cover: City of Minneapolis
Abstract QuickBird multispectral data have been used to classify and map impervious surface area for the area of Minneapolis, Minnesota, USA for 2009. Impervious area is mapped as a continuous variable from 0 to 100 percent for each 30-meter pixel.
Purpose The project objective was to generate a digital land cover classification of the City of Minneapolis in GIS-compatible format, with emphasis on mapping the tree cover that can be used by the City to evaluate existing tree cover and potential for additional plantings. Tree cover is defined as the leaves, branches and stems covering the ground when viewed from above.
Time Period of Content Date 2009
Currentness Reference QuickBird satellite imagery acquired on June 25, 2009

LiDAR imagery acquired in June 2007 was available from the U.S. Army Corps of Engineers

Progress Complete
Maintenance and Update Frequency None planned
Spatial Extent of Data Minneapolis, Minnesota, USA
Bounding Coordinates West: -93.3298655
East: -93.1942786
North: 45.0515297
South: 44.8904741
Place Keywords Minneapolis, Minnesota, USA
Theme Keywords Urban Tree Cover, Impervious surface, QuickBird, remote sensing
Theme Keyword Thesaurus
Access Constraints The Remote Sensing and Geospatial Analysis Laboratory, University of Minnesota, has attempted to produce accurate maps, statistics and information of urban tree cover, land cover, and impervious surface area. However, it makes no representation or warranties, either expressed or implied, for the data accuracy, currency, suitability or reliability for any particular purpose. Although every effort has been made to ensure the accuracy of information, errors and conditions originating from the source data and processing may be present in the data supplied. Users are reminded that all geospatial maps and data are subject to errors in positional and thematic accuracy. The user accepts the data “as is” and assumes all risks associated with its use. The University of Minnesota and project affiliates assume no responsibility for actual or consequential damage incurred as a result of any user's reliance on the data. The data are the intellectual property of the University of Minnesota.
Use Constraints This data may be used for educational and non-commercial purposes, provided proper attribution is given. Secondary distribution of the data is permitted, but not supported by the University of Minnesota. By accepting the data, the user agrees not to transmit this data or provide access to it or any part of it to another party unless the user includes with the data a copy of this disclaimer.
Contact Person Information Marvin Bauer, Professor
Remote Sensing and Geospatial Analysis Lab, University of Minnesota
1530 Cleveland Avenue North
St. Paul, MN 55108
Phone: (612)624-3703
Fax: (612)625-5212
Email : mbauer@umn.edu
Browse Graphic View a sample of the data
Browse Graphic File Description
Associated Data Sets St. Paul and Woodbury datasets also available


Section 2 Data Quality Information
Attribute Accuracy Accuracies for the classification are reported in the supplemental report titled: Minneapolis Tree Canopy Mapping - Final Report.pdf
Logical Consistency
Completeness Data provides complete coverage of Minneapolis, Minnesota, USA.
Horizontal Positional Accuracy LiDAR: The horizontal accuracy of the data was roughly 0.5 meters and stated to be “better than 1 meter.”
Vertical Positional Accuracy LiDAR: The vertical accuracy compared to 33 control points was 0.087 meters.
Lineage
The process is generally described as follows. QuickBird satellite imagery acquired on June 25, 2009 was used for the image classification. The image was clear and cloud-free. In addition, LiDAR imagery acquired in June 2007 was available from the U.S. Army Corps of Engineers. LiDAR (Light Detection And Ranging) is a remote sensing technology that uses pulses from a laser to measure the distance to the surface, and therefore can be used to generate elevation and height information. This imagery consisted of first-return information as well as the last return, or bare earth; from the two, a normalized digital surface model (nDSM) was generated that depicts height above bare earth (for example, of buildings and trees). The horizontal accuracy of the data was roughly 0.5 meters and stated to be “better than 1 meter.” Its vertical accuracy compared to 33 control points was 0.087 meters. The LiDAR data included full coverage for the entire City of Minneapolis. The LiDAR nDSM data correspond very closely to the buildings and trees, with the height information providing excellent separation of buildings from streets and trees from grass.

In all cases the class is defined as the surface area viewed from above. It should be noted that tree canopies will cover and obscure from view some of the grass, bare soil, streets and parts of some buildings. To take one example, the amount of impervious area will by definition typically be less than that measured by other methods, such as from “leaf-off” high resolution ortho aerial photos in which all impervious surfaces can be seen. Therefore results from the two methods should not be compared. Of the two methods, impervious area measurements from the higher resolution photos should be more accurate.
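
As a concrete illustration of the nDSM step, the following is a minimal sketch that assumes co-registered first-return and bare-earth rasters on the same grid; the file names and the rasterio package are illustrative assumptions, not the project's actual tooling.

    # Minimal sketch: derive a normalized digital surface model (nDSM) by
    # subtracting the bare-earth DEM from the first-return surface model.
    # File names are hypothetical; rasters are assumed to share the same grid.
    import numpy as np
    import rasterio

    with rasterio.open("lidar_first_return_dsm.tif") as dsm_src, \
         rasterio.open("lidar_bare_earth_dem.tif") as dem_src:
        dsm = dsm_src.read(1).astype("float32")   # first-return surface heights
        dem = dem_src.read(1).astype("float32")   # bare-earth elevations
        profile = dsm_src.profile

    # Height above bare earth; clamp small negative values caused by noise.
    ndsm = np.clip(dsm - dem, 0, None)

    profile.update(dtype="float32", count=1)
    with rasterio.open("ndsm.tif", "w", **profile) as dst:
        dst.write(ndsm, 1)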

Classification Procedures:

The primary land cover classifications were produced using object-based image analysis (OBIA) techniques available in eCognition Developer version 8.0. Ancillary software included ArcGIS version 9.3.1 and ERDAS Imagine version 2010. Additional customized routines were written in the Python version 2.5 scripting language to support processing as required. Shapefile information was provided by the City of Minneapolis to help identify streets, buildings, roads and highways, and water features.

The following principal steps were followed to implement the project:

  • The 2.4-meter resolution multispectral QuickBird imagery was “pan-sharpened” using the 0.6-meter panchromatic band and the subtractive resolution merge method in ERDAS Imagine.
  • QuickBird Imagery was georeferenced utilizing the available RPC files and a 30-meter DEM layer.
  • LiDAR data were georeferenced to match the QuickBird imagery.
  • A customized Python script was used to divide the georeferenced imagery into 750 x 1000 meter tiles with 10 percent overlap for further processing (a sketch of this tiling step follows the list). This step created 262 individual tiles.
  • The street layer was buffered in ArcGIS by 3 meters to create a polygon shapefile for subsequent use in eCognition.
  • The rule set was created using these process steps:
  • eCognition workspace of all 262 tiles was created with a customized load procedure.
  • Imagery was examined to locate a representative tile.
  • Supportive image layers such as the Normalized Difference Vegetation Index (NDVI) and Lee’s Sigma Edge Extraction were created to aid classification (an NDVI sketch follows the list).
  • Image objects were generated representing buildings, roads and water features from shapefiles and classified as such.
  • Since LiDAR data were available, the images were first segmented into tall and short features.
  • Remaining portions of the image were classified utilizing algorithms available in eCognition taking advantage of spectral information as well as other elements of image interpretation such as context, shape, size, site, association, pattern, shadows and texture.
  • Classification was exported from eCognition into a TIF raster file.
  • The rule set was fine-tuned and tested on additional random tiles distributed throughout Minneapolis.
  • The final rule set was used to classify all the tiles using eCognition Server.
  • Individual classified tiles were joined into a single mosaic using geometric seam lines in ERDAS Imagine Mosaic Pro.
  • The accuracy of the resulting classification was assessed in ERDAS Imagine using 1,413 stratified random points.
  • The classification mosaic was then manually examined and edited to eliminate classification errors.
  • Error corrections were re-run in eCognition Server to incorporate the corrections.
  • The final land cover mosaic was manipulated by ERDAS Imagine and ArcGIS into the output geodatabase utilizing both raster and vector forms of the data.
  • A Python script was written to summarize classification information into various shapefiles, such as parcels and neighborhoods (see the zonal summary sketch following the list).

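The tiling step referenced above can be sketched as follows. This is an illustrative modern equivalent of the project's custom script (the original used Python 2.5); the file names, the rasterio package, and the windowing approach are assumptions, not the original implementation.

    # Minimal sketch: cut a georeferenced image into 750 x 1000 m tiles with
    # 10 percent overlap. Illustrative only; not the project's original script.
    import rasterio
    from rasterio.windows import from_bounds

    TILE_W, TILE_H, OVERLAP = 750.0, 1000.0, 0.10

    with rasterio.open("quickbird_pansharpened.tif") as src:
        left, bottom, right, top = src.bounds
        step_x, step_y = TILE_W * (1 - OVERLAP), TILE_H * (1 - OVERLAP)
        tile_id = 0
        y = top
        while y > bottom:
            x = left
            while x < right:
                win = from_bounds(x, max(y - TILE_H, bottom),
                                  min(x + TILE_W, right), y,
                                  transform=src.transform)
                data = src.read(window=win)
                profile = src.profile.copy()
                profile.update(height=data.shape[1], width=data.shape[2],
                               transform=src.window_transform(win))
                with rasterio.open(f"tile_{tile_id:03d}.tif", "w", **profile) as dst:
                    dst.write(data)
                tile_id += 1
                x += step_x
            y -= step_y
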
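The NDVI layer mentioned in the rule-set steps is the standard normalized difference of the near-infrared and red bands. A minimal sketch follows; the file name and the band order (blue, green, red, NIR) for the pan-sharpened QuickBird image are assumptions.

    # Minimal sketch: compute NDVI = (NIR - Red) / (NIR + Red).
    import numpy as np
    import rasterio

    with rasterio.open("quickbird_pansharpened.tif") as src:
        red = src.read(3).astype("float32")   # band 3 assumed to be red
        nir = src.read(4).astype("float32")   # band 4 assumed to be near-infrared

    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)   # guard against divide-by-zero
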
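The per-parcel and per-neighborhood summaries in the final step amount to categorical zonal statistics. The following is a minimal sketch using the rasterstats package; the package choice, the shapefile name, and the class coding (class 1 = Tree Canopy) are assumptions rather than the project's original script.

    # Minimal sketch: tabulate land cover pixel counts within each neighborhood
    # polygon and report tree canopy (class 1 assumed) as percent cover.
    import geopandas as gpd
    from rasterstats import zonal_stats

    neighborhoods = gpd.read_file("neighborhoods.shp")
    counts_per_poly = zonal_stats("neighborhoods.shp",
                                  "mpls_final_classification_x.img",
                                  categorical=True)   # {class_value: pixel_count} per polygon

    for poly, counts in zip(neighborhoods.itertuples(), counts_per_poly):
        total = sum(counts.values()) or 1
        tree = sum(n for value, n in counts.items() if int(value) == 1)
        print(poly.Index, round(100.0 * tree / total, 1))
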
Key to the classification was the use of an object-based image analysis approach in which the imagery was first segmented into objects of similar pixels based on spatial as well as spectral-radiometric (color) attributes. Research has shown that it is the best approach for classification of high resolution imagery (Blaschke, 2010; Platt and Rapoza, 2008). Objects include more information than individual pixels, making it possible to take advantage of all the elements of image interpretation, particularly spatial information, including shape, size, pattern, texture, and context. Context is especially useful. Humans intuitively integrate “pixels” into objects and use contextual relationships to interpret images and draw intelligent inferences from them. Ancillary data such as GIS layers, for example, of streets and water bodies, can also be incorporated into the decision rules.
The object-based image analysis process in eCognition can broadly be split into two components, segmentation and classification. Segmentation primarily uses spectral information about individual pixels in the imagery to combine them into larger image objects or segments. As an example, individual pixels which comprise the roof of a building with similar brightness, normalized difference vegetation index (NDVI) and color values are combined to form an image object that represents the building. Other scaling information can be specified to regulate the size range of the desired objects. Once these image objects are created, they can be classified using a multitude of decision rules which utilize not only their spectral characteristics but also spatial information such as shape, size, proximity to other object types, texture, and context. The overall process is dependent on the quality of the initial segmentation into image objects.
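
eCognition's multiresolution segmentation is proprietary, but the idea of grouping spectrally similar, spatially contiguous pixels into image objects can be illustrated with a generic superpixel segmentation. The sketch below uses scikit-image; the package, parameters, file name, and band order are assumptions and this is not the method used in the project.

    # Minimal sketch: a generic object-generation step analogous in spirit to the
    # segmentation described above (not eCognition's algorithm).
    import numpy as np
    import rasterio
    from skimage.segmentation import slic

    with rasterio.open("quickbird_pansharpened.tif") as src:
        rgb = src.read([3, 2, 1]).transpose(1, 2, 0).astype("float32")  # assumed band order
    rgb = rgb / max(float(rgb.max()), 1.0)   # scale to [0, 1] for SLIC

    # Each label in 'segments' is an image object built from similar, adjacent pixels;
    # per-object statistics (mean NDVI, height, shape) can then drive classification rules.
    segments = slic(rgb, n_segments=5000, compactness=10, start_label=1)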

Accuracy Assessment:

Accuracy assessment was performed after the tiles were edited for misclassifications by generating stratified random points across the image and comparing the classified results to reference imagery consisting of color orthophotos provided by the City and imagery from ArcGIS Online. Stratified random point selection ensures that each class is allocated a number of points proportional to its share of the classified area. There were 1,413 points in the sample.
On maps the assessment points are displayed large enough to be visible, but in reality they are dimensionless geometric points that ERDAS Imagine randomly designates in the image.
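
A minimal sketch of proportional stratified random point selection from the classified raster follows. It mirrors the concept behind the ERDAS Imagine accuracy-assessment tool but is not that tool; the file name, no-data coding, and use of rasterio and numpy are assumptions.

    # Minimal sketch: draw ~1,413 assessment points so each class's share of
    # points matches its share of classified pixels, then record coordinates.
    import numpy as np
    import rasterio

    N_POINTS = 1413
    with rasterio.open("mpls_final_classification_x.img") as src:
        classes = src.read(1)
        transform = src.transform

    rows, cols = np.nonzero(classes > 0)      # assume 0 = background / no data
    labels = classes[rows, cols]
    rng = np.random.default_rng(42)

    sample = []
    for cls in np.unique(labels):
        in_class = np.flatnonzero(labels == cls)
        n_cls = max(1, round(N_POINTS * in_class.size / labels.size))  # proportional allocation
        for i in rng.choice(in_class, size=min(n_cls, in_class.size), replace=False):
            x, y = rasterio.transform.xy(transform, rows[i], cols[i])
            sample.append((int(cls), x, y))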

Results:
The results show that 31.5 percent of the area of the City is tree canopy.

Tabulation of the percent area of each of the seven land cover classes:

Land Cover Class      Percent
1. Tree Canopy           31.5
2. Grass and Shrubs      19.7
3. Bare Soil              0.2
4. Water                  6.2
5. Buildings             15.5
6. Streets                9.5
7. Other Impervious      17.3

We also conducted quantitative assessments of the classification by comparing a stratified random sample of points from the classification to high resolution aerial photography. The results are presented in the form of a contingency table or error matrix; further details on interpretation of error matrices and the statistics derived from them are in Appendix 1 of the final report associated with this map.
Our previous work had shown that automated object-based classification, while effective, can still result in obvious misclassifications. To reduce this impact, the classifications were compared to reference imagery and were edited manually where necessary. As an example, grass near freeways can become quite dry and take on the appearance of impervious cover, and larger objects with height, such as trucks and buses on roads, are often interpreted as buildings. We assessed the accuracy after these corrections were made; the results are shown in Table 3 of the final report. The overall accuracy was 91.9 percent.
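
The accuracy figures above reduce to simple arithmetic on the error matrix. The sketch below reproduces that calculation only (not the ERDAS Imagine tool), with rows taken as reference labels and columns as map labels, classes coded 1 through 7 as in the table above.

    # Minimal sketch: build an error matrix from paired reference/map labels and
    # derive overall, producer's, and user's accuracies.
    import numpy as np

    def error_matrix(reference, mapped, n_classes=7):
        m = np.zeros((n_classes, n_classes), dtype=int)
        for r, c in zip(reference, mapped):
            m[r - 1, c - 1] += 1              # rows = reference, columns = map
        return m

    def accuracies(m):
        overall = np.trace(m) / m.sum()        # reported as 0.919 for this map
        producers = np.diag(m) / m.sum(axis=1)  # per-class omission-error view
        users = np.diag(m) / m.sum(axis=0)      # per-class commission-error view
        return overall, producers, users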

Source Scale Denominator


Section 3 Spatial Data Organization Information
Native Data Set Environment eCognition Developer version 8.0, ArcGIS version 9.3.1, and ERDAS Imagine version 2010
Geographic Reference for Tabular Data
Spatial Object Type Raster
Vendor Specific Object Types
Tiling Scheme


Section 4 Spatial Reference Information
Horizontal Coordinate Scheme Universal Transverse Mercator
Ellipsoid Geodetic Reference System 80
Horizontal Datum NAD83
Horizontal Units Meters
Distance Resolution 30
Altitude Datum Not applicable
Depth Datum Not applicable
Cell Width
Cell Height
UTM Zone Number 15N
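
For users loading the raster outside ArcGIS, the parameters above (NAD83 datum, GRS 80 ellipsoid, UTM zone 15N, meters) correspond to EPSG:26915. A minimal check with pyproj (the package choice is illustrative):

    # Minimal sketch: the spatial reference listed above expressed as an EPSG code.
    from pyproj import CRS

    crs = CRS.from_epsg(26915)                 # NAD83 / UTM zone 15N
    print(crs.name)                            # "NAD83 / UTM zone 15N"
    print(crs.axis_info[0].unit_name)          # "metre"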


Section 5 Entity and Attribute Information
Entity and Attribute Overview Although the pixel size of the pan-sharpened QuickBird imagery is approximately 0.6 meters, the lower limit for size detection of individual objects is between 2 and 3 meters square. More specifically, to improve the spatial resolution of the multispectral imagery we used a pan-sharpening process which takes the spectral information from the 2.4-meter multispectral pixels and distributes it mathematically to the higher resolution 0.6-meter panchromatic pixels to create 0.6-meter multispectral pixels. While the pixel size is 0.6 meters, small or narrow objects (e.g., a sidewalk) may not be resolved in the imagery or classification.
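
As an illustration of the idea only (not ERDAS Imagine's subtractive resolution merge, which the project actually used), a generic ratio-based pan-sharpen can be sketched as follows; the inputs are assumed to be co-registered, with the multispectral bands already resampled to the 0.6-meter grid.

    # Minimal sketch: distribute panchromatic detail to resampled multispectral
    # bands with a Brovey-style ratio. Illustrative only; not the project's method.
    import numpy as np

    def ratio_pansharpen(ms, pan):
        """ms: (bands, rows, cols) resampled multispectral; pan: (rows, cols)."""
        intensity = ms.mean(axis=0)
        ratio = pan / np.maximum(intensity, 1e-6)
        return ms * ratio[None, :, :]
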
Entity and Attribute Detailed Citation


Section 6 Distribution Information
Publisher Remote Sensing and Geospatial Analysis Lab, University of Minnesota
Publication Date 04/12/2011
Contact Person Information Marvin Bauer, Professor
Remote Sensing and Geospatial Analysis Lab, University of Minnesota
1530 Cleveland Avenue North
St. Paul, MN 55108
Phone: (612)624-3703
Fax: (612)625-5212
Email: mbauer@umn.edu
Distributor's Data Set Identifier mpls_final_classification_x.img
Distribution Liability This data may be used for educational and non-commercial purposes, provided proper attribution is given. Secondary distribution of the data is permitted, but not supported by the University of Minnesota. By accepting the data, the user agrees not to transmit this data or provide access to it or any part of it to another party unless the user includes with the data a copy of this disclaimer.
Transfer Format Name HFA/Erdas Imagine Images (.img)
Transfer Format Version Number
Transfer Size
Ordering Instructions see website or contact info
Online Linkage Click here to download data. (See Ordering Instructions above for details.) By clicking here, you agree to the notice in "Distribution Liability" in Section 6 of this metadata.


Section 7 Metadata Reference Information
Metadata Date 12/05/2006
Contact Person Information Marvin Bauer, Professor
Remote Sensing and Geospatial Analysis Lab, University of Minnesota
1530 Cleveland Avenue North
St. Paul, MN 55108
Phone: (612)624-3703
Fax: (612)625-5212
Email: mbauer@umn.edu
Metadata Standard Name Minnesota Geographic Metadata Guidelines
Metadata Standard Version 1.2
Metadata Standard Online Linkage http://www.gis.state.mn.us/stds/metadata.htm


This page last updated 12/05/2006