Calculation of Tree Height and Canopy Crown from Drone Images Using Segmentation
Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography. 2015. Dec, 33(6): 605-614
Copyright © 2015, Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
  • Received : December 12, 2015
  • Accepted : December 12, 2015
  • Published : December 31, 2015
About the Authors
Ye Seul Lim
Member, Dept. of Smart ICT Convergence, Konkuk University (E-mail:yesullim@konkuk.ac.kr)
Phu Hien La
Dept. of Advanced Technology Fusion, Konkuk University (E-mail:hien.phu.la@gmail.com)
Jong Soo Park
Korea Asset Management Corporation (E-mail:survey@kamco.or.kr)
Mi Hee Lee
Member, Dept. of Advanced Technology Fusion, Konkuk University (E-mail:mihee7586@konkuk.ac.kr)
Mu Wook Pyeon
Member, Dept. of Civil Engineering, Social Echo-Tech Institute, Konkuk University (E-mail:neptune@konkuk.ac.kr)
Jee-In Kim
Corresponding Author, Member, Dept. of Smart ICT Convergence, Social Echo-Tech Institute, Konkuk University (E-mail:jnkm@konkuk.ac.kr)
Abstract
Drone imaging requires only a low-cost camera for capturing color images and is more cost-effective and controllable than airborne LiDAR. From the overlapped color images, we produced two high-resolution digital surface models over different test areas. After segmentation, we identified individual trees according to the method proposed by La et al. (2015) and computed the tree height and canopy crown size. Compared with the field measurements, the computed tree heights in test area 1 (coniferous trees) were accurate, while those in test area 2 (deciduous coniferous trees) were underestimated. In test area 1, the RMSE of the tree height was 0.84 m and that of the canopy crown width was 1.51 m; in test area 2, the RMSE of the tree height was 2.45 m and that of the canopy crown width was 1.53 m. The experimental results validate the use of drone images for the extraction of tree structure.
1. Introduction
Increasing numbers of studies on forests and green spaces are being pursued to reduce the carbon dioxide level in the atmosphere, with the aim of mitigating global warming (Lee et al., 2014). However, because a forest area is not easily accessible and contains a large number of trees, an economical and accurate method is needed to acquire tree information such as tree heights and diameters (Chang et al., 2006). Methods for the automatic extraction of the tree area and height can be grouped by source type as follows: (1) spectral data from satellite images or aerial photographs, (2) airborne LiDAR (Light Detection And Ranging) data, which provide information about the vertical structure of a forest, and (3) a combination of (1) and (2) (Chang et al., 2012; Chang et al., 2006; Lee and Ru, 2012). Tree extraction with the NDVI (Normalized Difference Vegetation Index) has also been studied, but because the spectral reflectance tends to be more correlated with the leaf density than with the tree height, it is difficult to obtain individual tree heights (Chopping et al., 2008). Therefore, high-resolution DSM (Digital Surface Model) generation, which supports individual tree identification and the computation of each tree's height, is important for the extraction of tree attributes (Zarco-Tejada et al., 2014).
Standard methodologies for individual tree detection are based on photogrammetric data and, more recently, on LiDAR data. Both require expensive sensors, well-trained personnel, and precise technology to obtain accurate results (Zarco-Tejada et al., 2014). Further, these data usually cover a large area, so using these techniques to investigate a small area or to acquire very high-resolution data (image spatial resolution finer than 10 cm, or LiDAR point density of more than 10 points/m²) is expensive. Meanwhile, advances in UAV (Unmanned Aerial Vehicle) technology and data processing have made it feasible to obtain very high-resolution imagery and 3D (three-dimensional) data (Kattenborn et al., 2014). Recent studies have demonstrated the capability of UAVs with respect to forest inventory (Hung et al., 2012; Zarco-Tejada et al., 2014). However, these studies have used only spectral data and height data; this might reduce the accuracy of information extraction.
In this study, we obtained color images using a low-cost camera mounted on a remote-controlled drone and then generated a high-resolution ortho-image and DSM over the overlapped area. An nDSM (Normalized Digital Surface Model) was generated by subtracting the available DTM (Digital Terrain Model) from the DSM. The individual trees were extracted on the basis of the segmentation of the fused image, which was generated by combining the nDSM and the color image. Next, the tree heights and crown widths were computed, and the results were compared with the field measurements.
2. Data and Method
- 2.1 Test area and images
For the experiment, aerial images were taken by a drone camera around the Sanghuh Memorial Library at Konkuk University on December 15, 2015. The drone used (Phantom 3 Professional, DJI) had four propellers, a camera, a GPS (Global Positioning System) receiver, and a gimbal (Fig. 1(a)), and came with a dedicated remote controller. The camera used for the experiment can take 12-megapixel still images and 4K (3840 × 2160) video. The test areas contained two different tree species: Picea abies of Pinaceae in test area 1 and Metasequoia glyptostroboides (M. glyptostroboides) of Taxodiaceae in test area 2 (Figs. 1(b) and (c)).
Fig. 1. Photographing platform and mosaic images of test areas 1 and 2
The photographing conditions and accuracy are provided in the Pix4D quality report. The flight height was 45 m and 55 m, and the spatial resolution was 0.018 m and 0.022 m in test areas 1 and 2, respectively. The drone was manually controlled, and the frame image size was 4000 × 3000 pixels. The photographed area was 0.0088 km² for test area 1 and 0.0152 km² for test area 2. Furthermore, we ensured that the same areas were photographed more than five times for DSM production.
In this study, an ortho-mosaic image and a DSM were generated using the Pix4D software in fully automatic mode. The imagery was geotagged using the GPS position and the triggering time recorded for each image. Only the absolute GPS coordinates were used for the generation of the ortho-mosaics and DSMs; no GCP (Ground Control Point) was used. The derived ortho-image and DSM were resampled to 2 cm. Specific information on the drone-image processing conducted in Pix4D is presented in Table 1.
Table 1. Summary of drone-image processing using Pix4D
For the images taken by the drone, the exterior orientation parameters were determined by bundle block adjustment. Noise and smoothing filtering were applied in the process of automatically producing 3D heights over the overlapped areas. The DSM was then produced by inverse distance weighting interpolation.
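Pix4D performs this interpolation internally; the following Python fragment is only an illustrative sketch of the inverse distance weighting step, not the software's actual implementation. The point array, grid coordinates, and parameter values are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree  # nearest-neighbour search


def idw_dsm(points, grid_x, grid_y, power=2.0, k=8):
    """Grid an (N, 3) array of x, y, z points into a DSM by inverse distance weighting."""
    tree = cKDTree(points[:, :2])

    # All raster cell centers as an (M, 2) array.
    gx, gy = np.meshgrid(grid_x, grid_y)
    cells = np.column_stack([gx.ravel(), gy.ravel()])

    # For each cell, take the k nearest points and weight them by 1/distance^power.
    dist, idx = tree.query(cells, k=k)
    dist = np.maximum(dist, 1e-6)            # guard against division by zero
    weights = 1.0 / dist ** power
    z = np.sum(weights * points[idx, 2], axis=1) / np.sum(weights, axis=1)
    return z.reshape(gy.shape)
```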
- 2.2 Extraction of tree structure
In this study, we used the method described in La et al. (2015) to extract individual tree crowns from the high-resolution DSM and ortho-image mosaic. The DSM data were generated using 30 (test area 1) and 43 (test area 2) frame images. The overall experimental flowchart is illustrated in Fig. 2.
1) The RGB (Red-Green-Blue) ortho-image and the DSM were derived by analyzing the stereo-images taken by the drone using the Pix4D software.
2) The tree area was extracted on the basis of the classification of the RGB ortho-image.
3) The nDSM was generated by subtracting the available DTM from the DSM. The DSM was produced automatically by the Pix4D drone-image analysis, while the DTM, built from LiDAR data, was already available in this study; the subtraction was performed with the spatial modeler of ERDAS IMAGINE (a sketch of this step is given after Fig. 2).
4) The RGB ortho-image and the nDSM were combined through layer stacking.
5) Segmentation available in the ERDAS IMAGINE software was carried out on the integrated data.
6) The tree position, tree height, and crown diameter were extracted for each segment.
7) The experimental results were compared with the field data in order to assess accuracy.
Fig. 2. Experimental flowchart
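The nDSM step of the workflow above was carried out with the spatial modeler of ERDAS IMAGINE; a minimal equivalent sketch in Python with rasterio is shown below, assuming hypothetical file names and a DTM already resampled to the same 2 cm grid as the DSM.

```python
import rasterio

# Hypothetical file names; the paper performs this step in ERDAS IMAGINE.
with rasterio.open("dsm_drone.tif") as dsm_src, rasterio.open("dtm_lidar.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")
    dtm = dtm_src.read(1).astype("float32")   # assumed to be on the same grid as the DSM
    profile = dsm_src.profile                 # reuse the DSM georeferencing

ndsm = dsm - dtm                              # object heights above ground
ndsm[ndsm < 0] = 0                            # clamp small negative residuals

profile.update(dtype="float32", count=1)
with rasterio.open("ndsm.tif", "w", **profile) as dst:
    dst.write(ndsm, 1)
```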
Segmentation is a method of partitioning raster images into segments on the basis of pixel values and locations: pixels that are spatially connected and have similar values are grouped into a single segment. Image segmentation methods have been applied to conventional aerial photography for the identification of individual tree crowns (Gougeon and Leckie, 2003; Suárez et al., 2005). In our study, the "Lambda Schedule Segmentation" tool available in ERDAS IMAGINE was used. It applies a bottom-up merging algorithm and considers the spectral content as well as each segment's texture, size, and shape when making merging decisions. The result is a thematic image in which the pixel values represent the class IDs of contiguous raster objects (ERDAS IMAGINE Help, 2015).
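Lambda Schedule Segmentation is proprietary to ERDAS IMAGINE. As a rough stand-in only, the sketch below applies a graph-based bottom-up merging segmentation (Felzenszwalb's algorithm from scikit-image) to the stacked RGB + nDSM raster; the function name and parameter values are hypothetical and would need tuning for real data.

```python
import numpy as np
from skimage.segmentation import felzenszwalb


def segment_rgb_ndsm(rgb, ndsm, scale=400, min_size=200):
    """Segment a fused RGB + nDSM raster into contiguous candidate crown segments.

    rgb  : (H, W, 3) uint8 ortho-image
    ndsm : (H, W) float array of heights above ground in metres
    """
    # Normalise all bands to [0, 1] so colour and height carry comparable weight.
    rgb_n = rgb.astype("float32") / 255.0
    ndsm_n = (ndsm / max(float(ndsm.max()), 1e-6)).astype("float32")
    stacked = np.dstack([rgb_n, ndsm_n])      # (H, W, 4) fused image

    # Graph-based merging: spatially connected, similar pixels end up in one label.
    return felzenszwalb(stacked, scale=scale, sigma=1.0, min_size=min_size)
```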
3. Results and Analysis
- 3.1 Individual tree identification
Fig. 3(a) shows the DSM (on the left) generated from the drone images using Pix4D and the segmentation result (on the right) overlaid on the RGB ortho-image in test area 1. Similarly, Fig. 3(b) shows the DSM (on the left) and the segmentation result (on the right) overlaid on the RGB ortho-image in test area 2.
Fig. 3. Segmentation results of test areas 1 and 2
Eleven trees (marked with red circles) out of a total of thirteen were clearly identified, as shown in Fig. 4(a), while two trees (marked with yellow circles) were not recognized. One additional, non-target tree was identified (marked with a purple circle), as shown in Fig. 4(b). Unidentified tree A had a small canopy crown compared with the trees on both sides, as shown in Fig. 4(c); we believe that a tree whose crown narrows so sharply toward the treetop cannot be recognized. Although tree B had a relatively wide canopy crown, it was not recognized because the center of its treetop was not as distinct as those of the other trees, as shown in Fig. 4(d). In contrast, the tree marked with a purple circle had a small canopy crown but could be identified because the width of its treetop canopy was distinct.
Fig. 4. Tree identification results of test area 1
Ten trees were identified in test area 2, as shown in Fig. 5(a). Nine trees, marked with red and purple circles, were correctly identified, as shown in Fig. 5(b), while the object marked with a yellow circle was detected as a single tree although it was originally two trees. One tree, which had relatively few leaves compared with the trees on both sides, as shown in Fig. 5(c), could not be identified.
Fig. 5. Tree identification results of test area 2
- 3.2 Tree height
After the segmentation process, the resulting images were converted into shape files, and the polygon parameters, including the centroid position and the polygon area, were derived using ESRI ArcGIS. This information was used for estimating the individual tree parameters: the location of each tree crown was represented by the centroid of its segment, and the individual tree height was determined as the highest nDSM value within the segmented polygon (Hyyppa et al., 2001; La et al., 2015).
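A minimal sketch of this attribute extraction with geopandas and rasterstats is given below (the paper uses ESRI ArcGIS for this step); the file names are hypothetical.

```python
import geopandas as gpd
from rasterstats import zonal_stats

# Hypothetical file names; the paper derives these attributes in ESRI ArcGIS.
segments = gpd.read_file("tree_segments.shp")            # polygons from the segmentation

# Tree position: centroid of each segment polygon.
segments["x"] = segments.geometry.centroid.x
segments["y"] = segments.geometry.centroid.y

# Tree height: highest nDSM value inside each segment polygon.
stats = zonal_stats("tree_segments.shp", "ndsm.tif", stats=["max"])
segments["height_m"] = [s["max"] for s in stats]

print(segments[["x", "y", "height_m"]].head())
```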
Tree height measurement methods are classified into similar-triangle, triangulation, and direct distance measurement approaches. Commonly used tools include the Weise hypsometer, transit, TS (Total Station), Haga hypsometer, and Suunto hypsometer. In this study, we measured the tree heights with a TS, which has relatively high accuracy and is based on the triangulation principle.
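As an illustration of the triangulation principle (not necessarily the exact field procedure used here), the tree height can be computed from the horizontal distance to the tree and the vertical angles to the treetop and the tree base measured by the TS:

```python
import math


def tree_height_from_ts(horiz_dist_m, angle_top_deg, angle_base_deg):
    """Tree height by triangulation from a total station set-up.

    horiz_dist_m   : horizontal distance from the instrument to the tree
    angle_top_deg  : vertical angle to the treetop, measured from the horizontal
    angle_base_deg : vertical angle to the tree base (negative when below the horizon)
    """
    top = horiz_dist_m * math.tan(math.radians(angle_top_deg))
    base = horiz_dist_m * math.tan(math.radians(angle_base_deg))
    return top - base


# Example: 30 m away, 35 degrees up to the top and 3 degrees down to the base
# gives roughly 30 * (tan 35 + tan 3), about 22.6 m.
print(round(tree_height_from_ts(30.0, 35.0, -3.0), 2))
```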
In this experiment, we compared the tree heights determined from the drone's frame images with the reference data (11 trees) acquired using the TS. Table 2 presents the tree height measured from the drone images, the tree height measured by the TS, and the difference between these values; Fig. 6 illustrates these values graphically. As shown in Fig. 6(a), the tree heights measured from the drone images and by the TS did not differ significantly overall, but Tree Nos. 1 and 5 showed considerable differences (Fig. 6(b)). An error in the TS measurement of Tree No. 1 is considered possible because this tree stood slightly behind the other trees. Tree No. 5 looked about as tall as Tree No. 6 to the eye, but it was identified as taller than Tree No. 6 from the drone data; this may have been influenced by Tree B (Fig. 4) right next to it. Because the tree height is taken as the highest point within each segment, the height of Tree B next to Tree No. 5 may have been measured instead. The difference in the average values measured using the TS and the drone was only about 0.5 m. Further, R² was 0.91 and the RMSE (Root Mean Square Error) was 0.84 m for test area 1; the tree heights measured by the TS and from the drone images did not differ significantly.
Table 2. Tree heights in test area 1 (unit: m)
(Drone: computation results from drone images; TS: field measurements using the TS; Difference: absolute value of the difference between the TS and drone values)
Fig. 6. Tree heights obtained using the drone and the TS, and their difference, in test area 1
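The accuracy figures quoted in this section (RMSE and R²) can be reproduced from paired height lists as in the sketch below; R² is computed here as the squared Pearson correlation of a linear fit, which is one common definition, since the paper does not state the exact variant it used.

```python
import numpy as np


def rmse_and_r2(drone_heights, ts_heights):
    """RMSE and R^2 between drone-derived heights and the TS reference."""
    d = np.asarray(drone_heights, dtype=float)
    t = np.asarray(ts_heights, dtype=float)
    rmse = float(np.sqrt(np.mean((d - t) ** 2)))        # root mean square error
    r2 = float(np.corrcoef(d, t)[0, 1] ** 2)            # squared Pearson correlation
    return rmse, r2
```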
Table 3 presents the tree height measured from the drone images, that measured by the TS, and the difference between them. Fig. 7(a) illustrates the values given in Table 3 graphically, and Fig. 7(b) shows the difference between the tree heights measured from the drone images and by the TS. In Fig. 7, Tree No. 2 in particular shows a considerable difference: while Tree No. 2 looked far taller than Tree No. 1 even to the eye, the heights of Tree Nos. 1 and 2 measured from the drone images did not differ significantly, which indicates that the drone-derived tree heights contained errors. In test area 2, the tree heights computed from the drone's frame images were underestimated compared with the reference data (six trees). The average value obtained using the TS was 26.85 m, and the average value obtained from the drone's frame images was 24.78 m; the difference between these two averages was larger than that of test area 1. This is caused by the difference between the species distributed in the two areas. While the coniferous tree (Picea abies) of test area 1 was not affected by the season, many leaves in test area 2 had already dropped because of seasonal effects; therefore, the matching points in this test area were not sufficient to extract the tree heights precisely. R² was 0.85 and the RMSE was 2.77 m in test area 2.
Table 3. Tree heights in test area 2 (unit: m)
Fig. 7. Tree heights obtained using the drone and the TS, and their difference, in test area 2
- 3.3 Canopy crown width
The diameter of the crown can be calculated from the area of the corresponding segment (Hyyppa et al., 2001; La et al., 2015), as shown in Eq. (1):

D = 2√(A/π)                                    (1)

where D denotes the diameter of an individual tree and A represents the area of the segment.
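Assuming the equivalent-circle formula reconstructed above (the original equation image is not available), the crown diameter follows directly from the segment area:

```python
import math


def crown_diameter(segment_area_m2):
    """Diameter of the circle with the same area as the crown segment (Eq. (1))."""
    return 2.0 * math.sqrt(segment_area_m2 / math.pi)


# Example: a 20 m^2 segment corresponds to a crown diameter of about 5.05 m.
print(round(crown_diameter(20.0), 2))
```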
To filter out the incorrectly estimated tree parameters, the height and diameter thresholds were applied. On the basis of the derived values, tree locations, and crown diameters, the tree crowns were reconstructed.
Table 4 presents the crown width computed from the drone images, that measured with a tape measure, and the difference between the two values. Fig. 8(a) shows the graph generated from the values in Table 4, and Fig. 8(b) illustrates the difference between the tape-measured values and those computed from the drone images.
Table 4. Crown width of test area 1 (unit: m)
(Drone: computation results from drone images; Tapeline: crown width measured with a tape measure; Difference: absolute value of the difference between the tape-measured and drone values)
Fig. 8. Computed crown width, field measurement, and the difference between the two in test area 1
The canopy crown width is the linear distance across the projected outline of the tree canopy. Because the crown is mostly oval or irregular in shape, the average of its minimum and maximum widths is used as the canopy crown width; in this study, the reference canopy crown widths were acquired with a measuring tape using this method. Table 5 presents the crown width computed from the drone images, that measured with the tape measure, and the difference between the two values. Fig. 9(a) shows the graph generated from the values in Table 5, and Fig. 9(b) illustrates the difference between the tape-measured values and those computed from the drone images. In the accuracy assessment, the canopy crown widths of test areas 1 and 2 were underestimated compared with the reference data. This can be attributed to the fact that the canopy crown is roughly cone-shaped and the drone captures only its upper part; in addition, overlap between neighboring canopy crowns may have led to underestimation in the extraction process. The RMSE was 1.51 m for test area 1 and 1.54 m for test area 2.
Table 5. Crown width of test area 2 (unit: m)
Fig. 9. Computed crown width, field measurement, and the difference between the two in test area 2
4. Conclusion
Using the on-board digital camera of a remotely controlled drone, we computed the number, heights, and canopy widths of trees. In the experiments on the Konkuk University campus test areas, the computed results were close to the field measurements. The average difference between the tree height measured by the TS and that derived from the drone images was 0.53 m in test area 1 and 2.07 m in test area 2, and the average difference between the field-measured crown width and that derived from the drone images was 1.27 m in test area 1 and 1.33 m in test area 2; the crown widths did not differ significantly. In particular, the tree heights in test area 1 showed only a small difference of 0.53 m. Unlike test area 1, M. glyptostroboides in test area 2, a deciduous coniferous forest, was not properly identified because a large number of leaves had fallen during the winter when the images were taken, that is, because of the seasonal effect. Considering this seasonal influence, the results of this paper suggest that tree heights can feasibly be measured from drone images.
Some errors in tree identification were observed in the DSM segmentation, which showed that the segmentation result is sensitive to the input parameters. If the characteristics of the tree extraction area had been roughly known in advance, the errors could have been reduced by introducing suitable initial parameter values. We therefore expect the proposed method and the corresponding results to contribute to the estimation of biomass quantity and carbon emissions.
Acknowledgements
This research was supported by the Ministry of Trade, Industry and Energy (MOTIE), Korea, through the Education Program for Creative and Industrial Convergence (Grant Number N0000717).
References
Chang A.J., Kim Y.I., Lee B.K., Yu K.Y. (2006), Estimation of individual tree and tree height using color aerial photograph and LiDAR data, Korean Journal of Remote Sensing, 22(6), 543-551.
Chang A.J., Kim Y.M., Kim Y.I., Lee B.K., Eo Y.D. (2012), Estimation of canopy cover in forest using KOMPSAT-2 satellite images, Journal of the Korean Society for Geospatial Information System, 20(1), 83-91.
Chopping M., Moisen G., Su L., Laliberte A., Rango A., Martonchik J.V., Peters D.P. (2008), Large area mapping of southwestern forest crown cover, canopy height, and biomass using MISR, Remote Sensing of Environment, 112(5), 2051-2063.
Hexagon Geospatial (2015), ERDAS IMAGINE Help, http://www.hexagongeospatial.com/support/documentation
Gougeon F.A., Leckie D.G. (2003), Forest Information Extraction from High Spatial Resolution Images Using an Individual Tree Crown Approach, Canadian Forest Service, Information Report.
Hung C., Bryson M., Sukkarieh S. (2012), Multi-class predictive template for tree crown detection, ISPRS Journal of Photogrammetry and Remote Sensing, 68, 170-183.
Hyyppa J., Kelle O., Lehikoinen M., Inkinen M. (2001), A segmentation-based method to retrieve stem volume estimates from 3-D tree height models produced by laser scanners, IEEE Transactions on Geoscience and Remote Sensing, 39(5), 969-975.
Kattenborn T., Sperlich M., Bataua K., Koch B. (2014), Automatic single palm tree detection in plantations using UAV-based photogrammetric point clouds, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Zurich, Switzerland, 5-7 September, XL-3, 139-144.
La H.P., Eo Y.D., Chang A.J., Kim C.J. (2015), Extraction of individual tree crown using hyperspectral image and LiDAR data, KSCE Journal of Civil Engineering, 19(4), 1078-1087.
Lee H.J., Ru J.H. (2012), Application of LiDAR data & high-resolution satellite image for calculate forest biomass, Journal of the Korean Society for Geospatial Information System, 20(1), 53-63.
Lee S.J., Park J.Y., Kim E.M. (2014), Development of automated model of tree extraction using aerial LiDAR data, Journal of the Korea Academia-Industrial Cooperation Society, 15(5), 3213-3219.
Suárez J.C., Ontiveros C., Smith S., Snape S. (2005), Use of airborne LiDAR and aerial photography in the estimation of individual tree heights in forestry, Computers and Geosciences, 31(2), 253-262.
Zarco-Tejada P.J., Diaz-Varela R., Angileri V., Loudjani P. (2014), Tree height quantification using very high resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo-reconstruction methods, European Journal of Agronomy, 55, 89-99.