Research on the pixel-based and object-oriented methods of urban feature extraction with GF-2 remote-sensing images

8 Mar 2019 · Dong-dong Zhang, Lei Zhang, Vladimir Zaborovsky, Feng Xie, Yan-wen Wu, Ting-ting Lu

During China's rapid urbanization, the acquisition of urban geographic information and timely data updating are fundamental tasks for the refined management of cities. With the development of domestic remote-sensing technology, the application of Gaofen-2 (GF-2) high-resolution remote-sensing imagery can greatly improve the accuracy of information extraction. This paper introduces an object-oriented classification approach for urban feature extraction based on GF-2 satellite data. A combination of spectral attributes, spatial attributes, and membership functions was employed to map the urban features of Qinhuai District, Nanjing. Data preprocessing was carried out in ENVI, and the preprocessed data were then imported into eCognition for object-oriented classification and extraction of urban feature information. Finally, the resulting raster classification was vectorized in ArcGIS, and the vector layers were stored in a database for further analysis and modeling. Accuracy assessment was performed using ground-truth data acquired through visual interpretation and from other reliable secondary sources. Compared with the result of pixel-based supervised (neural net) classification, the developed object-oriented method significantly improves extraction accuracy: after manual interpretation, an overall accuracy of 95.44% and a Kappa coefficient of 0.9405 were achieved, which objectively confirms the superiority of the object-oriented method and the feasibility of using GF-2 satellite data.
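
The reported overall accuracy (95.44%) and Kappa coefficient (0.9405) are both derived from the classification confusion matrix built against the ground-truth samples. As an illustration of how these two metrics are computed, the minimal sketch below uses NumPy; the 4-class confusion matrix shown is hypothetical and is not the paper's actual assessment data.

```python
import numpy as np

def overall_accuracy_and_kappa(cm):
    """Compute overall accuracy and Cohen's kappa from a square confusion
    matrix (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    p_o = np.trace(cm) / total                                   # observed agreement (overall accuracy)
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2   # chance agreement
    kappa = (p_o - p_e) / (1.0 - p_e)
    return p_o, kappa

# Hypothetical confusion matrix for four urban classes
# (e.g. buildings, roads, vegetation, water); counts are illustrative only.
cm = [[120,   3,   2,   0],
      [  4, 110,   5,   1],
      [  1,   6, 130,   3],
      [  0,   1,   2,  95]]

oa, kappa = overall_accuracy_and_kappa(cm)
print(f"Overall accuracy: {oa:.4f}, Kappa: {kappa:.4f}")
```

The same calculation underlies the comparison with the pixel-based supervised classification: both results are assessed against the same reference samples, so differences in overall accuracy and Kappa directly reflect the gain from the object-oriented workflow.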
