
Calculating coniferous tree coverage using unmanned aerial vehicle photogrammetry

Abstract

Unmanned aerial vehicles (UAVs) are a relatively new and rapidly developing tool in forest inventory and vegetation-monitoring studies. Because they can cover large areas, their widespread use has saved researchers and conservationists time and money when surveying vegetation for data analysis. Post-processing imaging software has further improved the effectiveness of UAVs by providing 3D models for accurate visualization of the data. We focus on determining coniferous tree coverage to show the current advantages and disadvantages of the orthorectified 2D and 3D models obtained from the photogrammetry software Pix4Dmapper Pro—Non-Commercial. We also examine the methodology used for mapping the study site and investigate the spread of coniferous trees. The collected images were transformed into 2D black-and-white binary images to calculate the coniferous coverage area of the study site using MATLAB. We conclude that the 3D model is effective for perceiving the tree composition of the designated site, while the orthorectified 2D map is appropriate for clearly differentiating coniferous and deciduous trees. We also discuss how UAVs could be improved for future use.

Introduction

The use of UAVs in ecological studies

Unmanned aerial vehicles (UAVs), or drones, have evolved into one of the most promising tools of recent years owing to their ability to acquire accurate real-time data (Govorčin et al. 2014). Their effectiveness and affordability have ushered in an era of UAV-aided ecological studies. UAVs have long been applied in remote sensing, for mapping land cover, monitoring deforestation and vegetation change, and reaching inaccessible areas. UAV technology has gained momentum in forest conservation because it requires little hands-on experience and can be operated safely from a distance. Commercially available UAVs can cover large areas and are compatible with post-processing software such as Pix4D. They have therefore found a place in otherwise slow and labour-intensive processes.

Serge A. Wich and Lian Pin Koh, co-founders of the site ConservationDrones.org, were among the first to monitor and protect the world’s forests and wildlife using conservation drones (Koh 2013). We discussed the potential of UAVs for data acquisition in ecology and conservation biology in our previous publication (Ivošević et al. 2015). Paneque-Gálvez et al. (2014) presented the feasibility and potential of small drones for community-based forest monitoring; they identified the current constraints and challenges in using UAV technology for forest monitoring and suggested that these will be overcome as the technology matures. Using high-resolution imagery and object-based image analysis, Getzin et al. (2012) assessed forest biodiversity with respect to canopy gaps and variation in soil composition.

Aerial photography techniques have flourished in forest monitoring applications owing to the availability of post-processing software that can compute 3D models from 2D images alone. Digital terrain modelling (DTM) and digital surface modelling (DSM) are used to build 3D topography with point cloud and triangle meshing techniques (Vallet et al. 2011; Küng et al. 2011). Another UAV-based conservation study has indicated the link between disease transmission and environmental factors such as deforestation; the researchers used Pix4D to map areas of deforestation and deterioration of species habitat (Fornace et al. 2014). The quality of the composed 3D structure depends on image overlap, image size and distinct visual content (Introduction Pix4D Webinar 1 2016). The accuracy of UAV photogrammetry is limited by the ability of the software to distinguish edges and texture (Küng et al. 2011; Introduction Pix4D Webinar 1 2016). While UAVs have been used successfully to inventory trees and measure their heights (Lisein et al. 2013), present-day image processing techniques are not applicable when millimetre-scale accuracy is required.

The overall goal of this study was to determine whether coniferous species in the designated study site can be accurately detected using the default Phantom 2 Vision+ platform and Pix4Dmapper Pro—Non-Commercial, and to calculate the percentage of the area covered by conifers using MATLAB.

Phantom 2 Vision+

The Phantom 2 Vision+ is a simple-to-set-up and easy-to-fly quadcopter developed by DJI, a Chinese company headquartered in Shenzhen (Fig. 1) (Dji.com 2016). The compact, highly integrated platform is an all-in-one system with a three-axis stabilization gimbal holding a 14-MP HD camera and a 4-GB microSD card for data storage. It also allows the camera to be tilted in flight to create unique angles and offers slow-motion shots and a live view via a dedicated mobile application (Dji.com 2014).

Fig. 1 The Phantom 2 Vision+ quadcopter starting a mission automatically using the Pix4Dcapture mapper application

Pix4D software

Pix4D is photogrammetry software for UAV, ground and aerial images. Founded in Switzerland in 2011, Pix4D has become the main provider and industry standard for professional unmanned aerial vehicle (UAV) processing software (Pix4D 2016a). Pix4D processing consists of three steps, carried out automatically: initial processing, point cloud densification, and DSM and orthomosaic generation.

Pix4D also provides access to a support site and forum, where the support team has compiled, and continually updates, a wealth of information related to Pix4D: academy video tutorials, webinars, example datasets and more (Pix4D 2016b).

Materials and methods

Study area

The study area was designated in Sobaeksan, Republic of Korea, more specifically in Namcheon Valley, Danyang. The location of the study site is given in Fig. 2, derived from the Pix4Dmapper Pro—Non-Commercial software. Sobaeksan was designated National Park No. 18 in 1987 and covers an area of 320.50 km². Featuring beautiful valleys and ridgelines, Sobaeksan National Park is known for its abundant wildlife and breath-taking nature (English.visitkorea.or.kr 2016).

Fig. 2 Google Earth representation of the study site generated using Pix4D software. The figure shows the mapped area of the study site together with its exact location

Two flights were conducted at Sobaeksan, encompassing a small forest area (Table 1). The flights were set to automatic mode to fly linear transects over a 70 × 70 m area at an altitude of 70 m and over a 70 × 70 m area at an altitude of 60 m, using the Pix4Dcapture mapper application (Fig. 3). These transects are set by the software so as to obtain the high image overlap needed to create good 2D and 3D models. The surveyed area can easily be expanded either by flying multiple missions or by increasing the altitude; note, however, that increasing the altitude sacrifices image detail, which is why we covered our study site at low altitudes. The dominant tree species are Pinus densiflora, Quercus mongolica, Quercus variabilis and Pinus koraiensis. The best coverage of the chosen area was acquired when the images were taken from altitudes of 60 and 70 m with the camera at 90°, facing the ground, most likely because the image overlap was then sufficient for post-processing into 3D maps. Data acquisition took place on 10 June 2015, and in total, 56 pictures were used to derive the results.

Table 1 Automatic flight details for image acquisition
Fig. 3 Automatic flight mission grid of the UAV. The UAV follows the grid lines automatically set by the Pix4Dcapture mapper application

Data processing

Automatically captured images were synchronized from the UAV’s SD card to the Pix4D application and downloaded to a personal computer after each data collection. The data processing steps are given in Fig. 4. The uploaded pictures were input to Pix4D to generate 2D and 3D maps. Copies of the individual pictures were used to identify the coniferous trees and colour them red. Marking of the coniferous trees was done in Photoshop CS6 v13.0: the Color Replacement Tool was used to saturate the coniferous tree colour to red shades (foreground colour: R,G,B = 243,7,11). The different shades of red are a consequence of the tool used and do not denote any information. These modified pictures were processed in Pix4D to generate 2D and 3D maps with the coniferous trees marked in red; these maps give an accurate visualization of the spread of conifers in the study site. Post-processing in MATLAB was then used to calculate the coverage area of the coniferous trees for the study site.
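As an illustration of how the red marking could be turned into a binary conifer mask without the manual Photoshop recolouring step, the following is a minimal MATLAB sketch. It is not the script used in this study; the file name and the colour thresholds are assumptions, chosen only to capture the strongly red pixels produced by the Color Replacement Tool.

```matlab
% Sketch only: isolate the red-marked conifer pixels from the annotated 2D map.
% File name and thresholds are illustrative, not the authors' values; the marking
% was done manually with the Color Replacement Tool (foreground R,G,B = 243,7,11),
% so any strongly red pixel is accepted here.
rgb = im2double(imread('orthomosaic_marked.png'));  % annotated 2D map (RGB, values in [0,1])
r = rgb(:,:,1); g = rgb(:,:,2); b = rgb(:,:,3);
coniferMask = (r > 0.6) & (g < 0.4) & (b < 0.4);    % "red enough" pixels
imwrite(coniferMask, 'conifer_mask.png');           % white = conifers, black = everything else
```

The resulting binary image plays the same role as the white/black mask produced manually in Photoshop (Fig. 7).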

Fig. 4 Flowchart of data processing, showing the steps involved in generating the maps and calculating the coniferous tree coverage area

Results

Generated map output

Visual cues on coniferous tree coverage are obtained from the 2D and 3D maps (Fig. 5a–d). Distances and area coverage can be extracted easily and more accurately from the 2D maps, while depth perception is provided by the 3D maps. These maps can serve as a first step in any ecological survey. The remainder of this article discusses how the coniferous tree coverage was estimated using the 2D map while the position of the conifers was visualized using the 3D map.

Fig. 5 2D and 3D map representations of the study site. a 2D map generated from unmodified pictures using Pix4D software. b 2D map generated from coloured pictures processed in Pix4D software. c 3D map generated from the unmodified pictures. d 3D map generated from the coloured pictures

Coniferous coverage area estimation

The Pix4D quality report showed high image matching, with a total covered area of 1820 m². Pix4D output the generated 2D map with dimensions of 1194 × 1807 pixels. The study site area in the 2D map image was made pure white and the remaining portion pure black using Photoshop (Fig. 6). The number of white pixels was counted in MATLAB, and the area per pixel was calculated with the formula:

Fig. 6 Shading the study site area white. The study site area is shaded white while the rest of the image is made black using Photoshop, in order to find the area per pixel using MATLAB

$$ \text{Area per pixel} = \frac{\text{Total area of the study site}}{\text{Number of white pixels}} $$
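This step is straightforward to reproduce. The following MATLAB sketch assumes the Fig. 6 mask has been exported as an image file (the file name is hypothetical) and uses the 1820 m² total area reported in the Pix4D quality report.

```matlab
% Sketch: area per pixel from the white study-site mask of Fig. 6.
% File name is hypothetical; the total area comes from the Pix4D quality report.
siteMask     = im2bw(imread('study_site_mask.png'), 0.5); % logical mask, white (1) = study site
totalAreaM2  = 1820;                                      % m^2, from the quality report
whitePixels  = sum(siteMask(:));                          % number of white pixels
areaPerPixel = totalAreaM2 / whitePixels;                 % m^2 represented by one pixel
```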

The 2D map was then taken again, and the red-shaded parts were made true white while the remaining parts were made true black (Fig. 7). This highlights only the coniferous tree coverage area. The area covered by conifers is given by

Fig. 7 Shading the coniferous tree covered areas white. The red-shaded parts, which indicate coniferous trees, are shaded white while the remaining parts are shaded black. This image is used to estimate the coverage area of the coniferous trees

$$ \begin{aligned} \text{Area covered by coniferous trees} &= \text{White pixels} \times \text{Area per pixel} \\ \text{Percentage coverage}\ (\%) &= \frac{\text{Area covered by coniferous trees}}{\text{Total area of the study site}} \times 100 \end{aligned} $$
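Continuing the sketch above (with totalAreaM2 and areaPerPixel already in the workspace), the conifer area and percentage follow from the Fig. 7 mask; the file name is again an assumption.

```matlab
% Sketch, continuing the previous one: conifer coverage from the white/black mask of Fig. 7.
coniferMask   = im2bw(imread('conifer_mask.png'), 0.5);   % logical mask, white (1) = conifers
coniferAreaM2 = sum(coniferMask(:)) * areaPerPixel;       % area covered by coniferous trees
coveragePct   = coniferAreaM2 / totalAreaM2 * 100;        % share of the study site
fprintf('Conifer area: %.4f m^2 (%.4f %%)\n', coniferAreaM2, coveragePct);
```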

The workflow of the calculation is shown in Fig. 8. The coverage area of the coniferous trees in the study site was calculated to be 291.5814 m², which, compared with the total study site area of 1820 m², gives a conifer coverage of 16.0209%.

Fig. 8 Flowchart of the MATLAB processing used to calculate the coniferous tree coverage area. The area covered by a single pixel is found first; the conifer coverage is then obtained by counting the pixels covered by conifers and multiplying by the area per pixel

The images that had been converted to true white and black in Photoshop were also transformed to binary black and white using the im2bw command in MATLAB. This two-step conversion was a simple way to prevent any grey levels other than black and white from arising when using the im2bw command. Although López-Fernández et al. (2015) used focal length and flight altitude to measure the area coverage of solar panels in their study, the same method was not applicable here because of the subtle variations in altitude caused by wind. We therefore resorted to a manual method of marking and calculating the coverage. Automatic detection of coniferous trees, along the lines of the methods used by Dalponte et al. (2015) and Puerto et al. (2015), was not carried out owing to the unavailability of high-level image processing techniques. Nevertheless, the approach shows promise for automation.
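As a small illustration of this two-step conversion, the sketch below binarizes a Photoshop-exported mask (hypothetical file name) with im2bw and checks that no intermediate grey values remain.

```matlab
% Sketch: binarize the Photoshop-exported mask and confirm it contains only 0 and 1.
raw = imread('conifer_mask.png');   % true black/white image exported from Photoshop
bw  = im2bw(raw, 0.5);              % pixels brighter than half intensity become 1
disp(unique(bw));                   % expected output: 0 and 1 only
```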

Utilities

The benefits of using UAV technology in ecological research are numerous. The Phantom 2 Vision+ is a small, nimble and light UAV suitable for reaching areas inaccessible on foot. In addition, this particular UAV is a quadcopter, which has the advantage of vertical take-off and landing. Its ability to fly pre-programmed autonomous missions and acquire data on the go automatically sets it apart from conventional surveying methods. The bird’s-eye view of the chosen study site provides good coverage and insight into the spread of coniferous trees.

2D and 3D maps are generated automatically by the Pix4D software, and they can be used in conjunction to assess various terrain parameters. While the 2D map is better suited to assessing areas and distances, the 3D map gives a more realistic visual perspective; the inclusion of depth makes the 3D map suitable for expressing altitude variations within the target site. Used together, the two can help monitor changes in the forest canopy and accurately position markers for applications such as forest disease management and estimation of forest degradation. As demonstrated in this article, as few as two missions can provide a comprehensive understanding of the mapped area, and the mapped area can easily be expanded by flying and connecting more missions.

Constraints

Small UAVs share several common technical issues, such as short flight duration, difficulty in maintaining a constant flight altitude, reduced stability in wind and turbulence, and lack of GPS satellite connectivity in signal-shadowed regions leading to loss of control. We also took into account the inaccessibility of the area due to the high density of trees, making sure the flight routes were safe enough to mitigate possible hazards to the environment and people. Additionally, some countries require licensing and specific flight permission, especially as flight altitude and overall system weight increase (Cramer 2013; Droneflight 2016). The flight missions were specially authorized by the Korean government authorities.

One current technological barrier for 3D map-generating software is the fidelity with which it can reproduce the images: the algorithms average out minute details such as leaf shapes, sand and snow (Introduction Pix4D Webinar 1 2016). A 2D map, although it offers no depth perception, retains most of the detail from the images. In combination, 2D and 3D maps compensate for each other’s shortcomings.

Conclusions

The use of UAVs for modern photogrammetry applications has flourished over the last decade, with various platforms being used to address forest inventories (Puliti et al. 2015) and to create forest canopy height models (Lisein et al. 2013). In this research, we aimed to test and analyse the automatic capabilities of the Phantom 2 Vision+ with the aid of image processing and photogrammetric software.

To become one of the most reliable data collection methods in biology and other sciences, a UAV must satisfy certain primary criteria in order to do its job properly and, most importantly, to outperform currently available methods. These criteria include sensor size, accuracy, weight and flight duration, all of which will either enable it to stand out from other research methods or hold it back until a better solution is found.

The flexibility of UAVs in data acquisition timing and the affordability of image extraction could allow the UAV industry to surpass traditional aerial techniques (Zhang and Kovacs 2012). Furthermore, UAVs are among the most flexible tools for responding to immediate environmental data collection needs. Where an area is inaccessible, as was the case at our study site, the UAV is hardly replaceable as a device for obtaining the nadir picture perspective.

The UAV’s potential as the next best tool in conservation biology also depends greatly on the interest of other researchers and their willingness to dedicate time and effort to developing their skills in using this tool for the benefit of scientific research. UAV manufacturers also have a large role to play if they are willing to adapt their production plans to the varied needs of researchers instead of limiting UAVs to one particular type of flight usage. Understandably, this will develop as interest in these flying machines grows.

To deliver this message to a wider audience, interaction and teamwork between researchers across the globe are needed so that real changes can be made to the conservation of some of the most valuable areas of our planet. The success of these developments could bring positive global change and enhance the application of technological advances in ecology.

Abbreviations

UAV: Unmanned aerial vehicle

UAVs: Unmanned aerial vehicles

References

  • Cramer, M. (2013). The UAV@LGL BW project—a NMCA case study (pp. 9–13). Stuttgart, Germany: Proceedings of 54th Photogrammetric Week.

  • Dalponte, M., Ene, L. V., Marconcini, M., Gobakken, T., & Næsset, E. (2015). Semi-supervised SVM for individual tree crown species classification. ISPRS Journal of Photogrammetry and Remote Sensing, 110, 77–87.

  • Dji.com (2014). Phantom 2 Vision+. http://www.dji.com/product/phantom-2-vision-plus/feature.

  • Dji.com (2016). DJI—about us. http://www.dji.com/company.

  • Droneflight (2016). General Drone/UAV FAQ’s. http://shop.droneflight.co.uk/pages/general-drone-uav-faq-s.

  • English.visitkorea.or.kr (2016). Sobaeksan National Park (Gyeongbuk Area). http://english.visitkorea.or.kr/enu/SI/SI_EN_3_1_1_1.jsp?cid=264153.

  • Fornace, K. M., Drakeley, C. J., William, T., Espino, F., & Cox, J. (2014). Mapping infectious disease landscapes: unmanned aerial vehicles and epidemiology. Trends in Parasitology, 30(11), 514–519.

  • Getzin, S., Wiegand, K., & Schöning, I. (2012). Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles. Methods in Ecology and Evolution, 3, 397–404.

  • Govorčin, M., Pribićević, B., & Đapo, A. (2014). Comparison and analysis of software solutions for creation of digital terrain model using unmanned aerial vehicles. In Photogrammetry and Remote Sensing, 14th International Multidisciplinary Scientific GeoConference SGEM.

  • Introduction Pix4D Webinar 1: Introduction to Modern Photogrammetry and Optimal Flight Plans (2016). https://www.youtube.com/watch?v=NGdZ8O2cWks. Accessed 25 Apr 2016.

  • Ivošević, B., Han, Y.-G., Cho, Y., & Kwon, O. (2015). The use of conservation drones in ecology and wildlife research. Ecology and Environment, 38(1), 113–188.

  • Koh, L. P. (2013). A drone’s-eye view of conservation. http://www.ted.com/speakers/lian_pin_koh.

  • Küng, O., Strecha, C., Beyeler, A., Zufferey, J. C., Floreano, D., Fua, P., & Gervaix, F. (2011). The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery. UAV-g 2011-Unmanned Aerial Vehicle in Geomatics, No. EPFL-CONF-168806.

  • Lisein, J., Pierrot-Deseilligny, M., Bonnet, S., & Lejeune, P. (2013). A photogrammetric workflow for the creation of a forest canopy height model from small unmanned aerial system imagery. Forests, 4, 922–944.

  • López-Fernández, L., Lagüela, S., Picón, I., & González-Aguilera, D. (2015). Large scale automatic analysis and classification of roof surfaces for the installation of solar panels using a multi-sensor aerial platform. Remote Sensing, 7(9), 11226–11248.

  • Paneque-Gálvez, J., McCall, M. C., Napoletano, B. M., Wich, S. A., & Koh, L. P. (2014). Small drones for community-based forest monitoring: an assessment of their feasibility and potential in tropical areas. Forests, 5(6), 1481–1507.

  • Pix4D (2016a). Pix4D—Drone Mapping Software. https://pix4d.com/.

  • Pix4D (2016b). Support—Pix4D. https://pix4d.com/support/.

  • Puerto, D. A., Gila, D. M. M., García, J. G., & Ortega, J. G. (2015). Sorting olive batches for the milling process using image processing. Sensors, 15(7), 15738–15754.

  • Puliti, S., Ørka, H. O., Gobakken, T., & Næsset, E. (2015). Inventory of small forest areas using an unmanned aerial system. Remote Sensing, 7(8), 9632–9654.

  • Vallet, J., Panissod, F., Strecha, C., & Tracol, M. (2011). Photogrammetric performance of an ultra-light weight swinglet “UAV”. UAV-g, No. EPFL-CONF-169252.

  • Zhang, C., & Kovacs, J. M. (2012). The application of small unmanned aerial systems for precision agriculture: a review. Precision Agriculture, 13(6), 693–712.


Acknowledgements

This work was supported by the Korea Ministry of Environment (MOE) under the ‘Public Technology Program based on Environmental Policy (2014000210003)’.

Funding

This work was supported by the Korea Ministry of Environment (MOE) under the ‘Public Technology Program based on Environmental Policy (2014000210003)’.

Availability of data and materials

Not applicable.

Authors’ contributions

BI analysed and interpreted the coniferous tree image data and was a major contributor to writing the manuscript. YH collected the coniferous tree image data with the UAV and processed it using the 3D mapping software. OK designed the work and thoroughly revised the paper. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Author information

Corresponding author

Correspondence to Ohseok Kwon.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Ivosevic, B., Han, YG. & Kwon, O. Calculating coniferous tree coverage using unmanned aerial vehicle photogrammetry. j ecology environ 41, 10 (2017). https://doi.org/10.1186/s41610-017-0029-0


Keywords