Syntax Literate: Jurnal Ilmiah
Indonesia p-ISSN: 2541-0849 e-ISSN: 2548-1398
Vol. 7, No. 5, Mei 2022
SIMULATING THE USE OF UAV
PHOTOGRAMMETRY FOR MONITORING LAND SUBSIDENCE AT URBAN AREA
Dimas C Alamsyah
Geodesy Research Group, Faculty of Earth
Sciences and Technology, Institut Teknologi
Bandung, Jl. Ganesha 10, Bandung, Indonesia
Email: [email protected]
Abstract
The Unmanned Aerial Vehicle (UAV) photogrammetry method has been widely used in various
applications. One of its potential uses is estimating land deformation, e.g., land
subsidence. This study analyzes the opportunity of using UAV photogrammetry to detect
land subsidence in urban areas, given its relatively low cost and wide area coverage in
a relatively short time. Two photogrammetry measurements were conducted at two different
epochs with different flight altitudes to analyze the potential of the method. The
Digital Surface Models (DSMs) generated from the two observation periods were compared
to obtain a land subsidence model. Based on the results, the photogrammetry method can
detect land subsidence of at least 5 cm at a flight height of 80 meters, 10 cm at a
flight height of 100 meters, and 11 cm at a flight height of 150 meters. We also
highlight several things to consider when applying the photogrammetry method to observe
land subsidence, e.g., removing the uncorrelated height values obtained from different
observations.
Keywords: UAV, DSM, land subsidence monitoring.
Introduction
Photogrammetry is a mapping method that is currently
widely used. Its advantage over other methods, such as Global Navigation Satellite
System (GNSS) observations, is its ability to map large areas at a fairly low cost [1].
The platform generally used in photogrammetry is the aircraft, but over time it has
been replaced by the Unmanned Aerial Vehicle (UAV), commonly known as the drone. UAV
photogrammetry can be considered the latest photogrammetry measurement tool [2].
Drones have since been used and developed massively for
mapping needs, but the camera commonly used on a drone is a non-metric camera [3]. A
non-metric camera is not specifically designed for mapping purposes, so the resulting
accuracy may not be as good as that of a metric camera. The solution to this problem is
to calibrate the camera before data acquisition (precalibration). This is done to reduce
systematic errors, such as the bowl effect, that affect the Digital Elevation Model
(DEM) [4].
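To make the role of precalibration concrete, the sketch below applies the Brown radial distortion model that camera calibration typically estimates; the coefficient values are illustrative only, not calibration results from this study.

```python
def brown_radial(x, y, k1, k2, k3):
    """Apply Brown radial distortion to a normalized image coordinate.

    (x, y) is measured from the principal point; k1..k3 are radial
    distortion coefficients (illustrative values are used below)."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    return x * factor, y * factor

# At the principal point (r = 0) the model leaves the point unchanged.
cx, cy = brown_radial(0.0, 0.0, k1=-1.2e-1, k2=3.4e-2, k3=-5.6e-3)

# Away from the centre the point is displaced; if the coefficients are
# not estimated and corrected, this displacement propagates into the
# DEM as systematic doming (the "bowl effect").
dx, dy = brown_radial(0.5, 0.5, k1=-1.2e-1, k2=3.4e-2, k3=-5.6e-3)
```

Precalibration estimates these coefficients once, before the flight, so every frame can be undistorted with the same model.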
The use of the UAV photogrammetry method is not limited
to mapping needs [5]. Another potential use of UAV photogrammetry is evaluating surface
changes, e.g., land subsidence, by comparing two digital surface models (DSMs) obtained
at different acquisition times [6]. Land subsidence here describes the changes in the
geometric shape of objects, horizontally and vertically, from their initial conditions
over time [7].
Land subsidence is a problem that can cause disasters;
one of the problems it leads to is flooding. The subsidence is thought to be caused by
excessive groundwater exploitation [8]. As subsidence occurs over relatively large
areas, several geodetic methods, such as leveling and GNSS observations, are
time-consuming [9]. It is therefore important to assess the applicability of the
photogrammetric method for sensing land surface deformation, as it offers time and cost
efficiency [10]. In this study, the UAV photogrammetry method is tested to analyze its
applicability.
Figure 1
Research area
The research was simulated in a residential area with the
coverage visualized in Figure 1. For this simulation, we focus on an area of 5 hectares
that is currently undergoing dredging and stockpiling for other residential development
needs. This location was deliberately chosen because it is considered suitable for
testing the method: the surface changes can be observed quickly and are quite
significant, as evidenced by the visualization of the area in Figure 2. Additionally,
mapping a wider area was intended to allow the Ground Control Points (GCPs) and
Independent Check Points (ICPs) to be placed freely with a fairly even distribution.
Figure 2
Dredging area
visualization
METHODOLOGY
In the acquisition process, photogrammetry was conducted
at different flight heights and two different epochs [11]. The time interval between the
first (T1) and second (T2) data acquisitions is one month, during which the research
area changed due to dredging. Details of the acquisition planning can be seen in
Table 1.
Table 1
Data acquisition parameters
Mission Code | Flying Altitude (m)
Mission 1 (T1) | 80
Mission 2 (T1) | 100
Mission 3 (T1) | 150
Mission 4 (T2) | 80
Mission 5 (T2) | 100
Mission 6 (T2) | 150
The data acquisition process began with the installation
of GCPs and ICPs, continued with acquiring their coordinates using GNSS with the
Real-Time Kinematic (RTK) method, and then proceeded with the acquisition of aerial
photos. The GCP and ICP installation locations are visualized in Figure 3.
Data processing was then carried out on the acquired
data. Processing in UAV photogrammetry starts with the image matching process, continues
with the resection and intersection processes, then forms a DEM, and proceeds to the
orthomosaic map formation stage. In this study, the Digital Surface Model (DSM), a
derivative product of the DEM, is used to analyze elevation information, while the
orthomosaic map helps visualize the changes that occur in areas experiencing land
subsidence.
Figure 3
Distribution
of GCPs and ICPs
The acquisition results at the two epochs, with their
flight altitude variations, are processed to obtain DSMs. The DSMs from the processing
results are then paired according to flight variation. Then we formed a point layer of
2500 by 2500 points to form a dense grid. The point visualization can be seen in
Figure 4.
Figure 4
Visualization
of gridded sample data
The point layer for each pair of flight altitude
variations is then filled in with the height values from the DSM of each epoch. The
height data are compared and cleaned of errors with the help of regression analysis and
statistical distributions. The height data cleaned of error values are then set aside
and used to form a land subsidence model using kriging interpolation. The data
processing flow can be seen in Figure 5.
Figure 5
Data
processing diagram
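The pairing and differencing step can be sketched as follows; the DSM grids here are synthetic stand-ins (reduced from the 2500 x 2500 sample grid used in the study), and the 3-sigma screen is only one simple way to drop uncorrelated heights:

```python
import numpy as np

# Synthetic stand-ins for the T1 and T2 DSM height grids (metres).
rng = np.random.default_rng(0)
dsm_t1 = 760.0 + rng.normal(0.0, 0.02, size=(50, 50))
dsm_t2 = dsm_t1 - 0.08 + rng.normal(0.0, 0.02, size=(50, 50))  # ~8 cm drop

# Height change between epochs: negative values indicate subsidence.
dh = dsm_t2 - dsm_t1

# Simple screening of uncorrelated heights (moving vegetation, hue
# changes): reject samples far from the median change.
keep = np.abs(dh - np.median(dh)) < 3.0 * dh.std()
subsidence_samples = dh[keep]
```

The surviving samples then feed the interpolation stage that produces the subsidence model.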
2.1 MEASUREMENT OF GROUND CONTROL POINT (GCP)
AND INDEPENDENT CHECK POINT (ICP)
Ground Control Points (GCPs) are required to constrain
the data point clouds [12]. The coordinates resulting from the basic processing of the
aerial photos can then be corrected to ground coordinates that represent the actual
position. Independent Check Points (ICPs) are intended to check the accuracy of the
formed model, which is tied to the ground control points. The horizontal GNSS position
accuracy is denoted by the horizontal root mean square error (RMSEH), while the vertical
GNSS position accuracy is denoted by the vertical root mean square error (RMSEV) [13].
Table 2 and Table 3 show the GCP and ICP coordinates at the two epochs.
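The accuracy check at the ICPs reduces to computing horizontal and vertical RMSE between model-derived and GNSS-surveyed coordinates. A minimal sketch, with hypothetical coordinate pairs (not the actual check-point residuals of this study):

```python
import math

# Hypothetical (Easting, Northing, Elevation) pairs in metres:
# model-derived coordinates versus GNSS-surveyed ICP coordinates.
model = [(778713.62, 9241581.85, 762.97),
         (778735.02, 9241510.39, 758.21),
         (778712.95, 9241406.96, 756.09)]
gnss = [(778713.63, 9241581.84, 762.95),
        (778735.00, 9241510.40, 758.23),
        (778712.94, 9241406.95, 756.07)]

n = len(model)
# Horizontal RMSE combines Easting and Northing residuals.
rmse_h = math.sqrt(sum((m[0] - g[0]) ** 2 + (m[1] - g[1]) ** 2
                       for m, g in zip(model, gnss)) / n)
# Vertical RMSE uses the elevation residuals only.
rmse_v = math.sqrt(sum((m[2] - g[2]) ** 2 for m, g in zip(model, gnss)) / n)
```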
The coordinates of the GCPs and ICPs were estimated using
a Hemisphere GNSS receiver. The RTK method was implemented, tied to the Continuously
Operating Reference Station (CORS) of the Geospatial Information Agency with the code
CLBG in the Lembang area, with a baseline length of about 10 to 11 kilometers to the
measurement location. Corrections were delivered using the Networked Transport of RTCM
via Internet Protocol (NTRIP) method.
Table 2
GCP and ICP coordinates for epoch one
GCP and ICP First Data Acquisition (UTM Zone 48 S)
Point | Easting (m) | Northing (m) | Elevation (m) | RMSEH (m) | RMSEV (m)
GCP-1 | 778664.733 | 9241680.998 | 765.493 | 0.006 | 0.014
GCP-2 | 778606.94 | 9241632.858 | 762.154 | 0.007 | 0.015
GCP-3 | 778605.236 | 9241564.463 | 761.172 | 0.01 | 0.02
GCP-4 | 778636.277 | 9241522.222 | 760.644 | 0.007 | 0.013
GCP-5 | 778596.592 | 9241494.368 | 760.899 | 0.006 | 0.011
GCP-6 | 778696.852 | 9241426.683 | 757.317 | 0.006 | 0.011
GCP-7 | 778708.783 | 9241404.64 | 756.095 | 0.006 | 0.01
GCP-8 | 778701.325 | 9241541.795 | 761.388 | 0.011 | 0.021
GCP-9 | 778656.456 | 9241538.098 | 761.43 | 0.006 | 0.012
GCP-10 | 778728.78 | 9241638.97 | 765.905 | 0.008 | 0.015
ICP-1 | 778713.62 | 9241581.853 | 762.969 | 0.009 | 0.019
ICP-2 | 778735.017 | 9241510.38 | 758.225 | 0.009 | 0.019
ICP-3 | 778712.944 | 9241406.955 | 756.073 | 0.009 | 0.019
ICP-4 | 778691.256 | 9241437.649 | 757.439 | 0.009 | 0.017
ICP-5 | 778709.021 | 9241428.455 | 757.354 | 0.009 | 0.017
ICP-6 | 778645.806 | 9241536.109 | 760.966 | 0.006 | 0.013
ICP-7 | 778586.748 | 9241498.234 | 759.656 | 0.009 | 0.018
ICP-8 | 778620.87 | 9241633.152 | 762.209 | 0.008 | 0.017
ICP-9 | 778667.406 | 9241663.038 | 764.515 | 0.008 | 0.017
ICP-10 | 778667.559 | 9241681.269 | 766.003 | 0.006 | 0.013
Table 3
GCP and ICP coordinates for epoch two
2.2 AERIAL PHOTO DATA PROCESSING
2.2.1 EXPORT DATA
The aerial survey of the area produced navigation
coordinates and images stored in JPG format [14]. The photos of the study area include
all objects within it, such as trees, roads, buildings, and other objects captured in
the shooting. The number of photos for each flying altitude is given in Table 4; the
greater the flight altitude, the fewer photos were obtained.
The difference in the number of photos produced in the
aerial photo acquisition process is caused by adjustments to other flight parameters
such as sidelap, overlap, and focal length [15]. Sidelap and overlap are closely related
to the focal length and flight altitude of the vehicle, because these parameters
regulate the ground sweep of each frame. Moreover, sidelap and overlap are parameters
set by the users [16]. The number of photos in the acquisition process is therefore
adjusted to satisfy the input parameters planned beforehand.
Table 4
Number of photos based on flying height
Flying Altitude (m) | Number of Photos
80 | 114
100 | 82
150 | 42
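The inverse relation between flying height and photo count follows directly from the frame footprint and the overlap settings. A rough sketch, where the sensor and lens values are hypothetical (a typical small-UAV camera); only the overlap and sidelap values follow this study:

```python
import math

def photo_count(height_m, area_w_m, area_l_m,
                sensor_w_mm=13.2, sensor_h_mm=8.8, focal_mm=8.8,
                overlap=0.80, sidelap=0.60):
    """Rough number of photos for a nadir survey block.

    Sensor/lens values are illustrative; overlap and sidelap match the
    acquisition parameters used in this study."""
    # Ground footprint of one frame (metres): scale = height / focal.
    gw = sensor_w_mm / focal_mm * height_m   # cross-track width
    gl = sensor_h_mm / focal_mm * height_m   # along-track length
    # Exposure spacing after removing the overlapping fraction.
    ds = gl * (1.0 - overlap)                # spacing along a line
    dl = gw * (1.0 - sidelap)                # spacing between lines
    lines = math.ceil(area_w_m / dl)
    per_line = math.ceil(area_l_m / ds)
    return lines * per_line

# Higher flights see more ground per frame, so fewer photos are needed.
n80 = photo_count(80, 300, 250)
n100 = photo_count(100, 300, 250)
n150 = photo_count(150, 300, 250)
```

With fixed overlap fractions, the footprint grows linearly with height, so the photo count falls roughly with the square of the flying height.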
2.2.2 DATA IMPORT AND
ELEVATION POINT ESTABLISHMENT (DENSE CLOUD)
The exported data was imported into the Agisoft Metashape
software at this stage. After import, the data is automatically processed based on the
alignment of objects from two or more photos to generate sparse cloud object points,
paired according to identical objects [17], as visualized in Figure 6.
Figure 6
Illustration of the sparse cloud formation process by aligning two photos
The next process was to register the GCPs by matching
their coordinate values with predetermined objects (premarks) [18], as shown in
Figure 7, to get the corrected sparse cloud.
Figure 7
Result of the corrected sparse cloud
The corrected sparse cloud is then densified into a dense
cloud, a distribution of points with height values, visualized in Figure 8. Those
points are then used to form the DSM.
Figure 8
Result of dense cloud formation
2.2.3 FORMATION OF THE
DIGITAL SURFACE MODEL (DSM)
The formation of the DSM is the next process. In this
research, the DSM was formed from elevation data in the form of points called the dense
cloud [19]. The DSM is the basic data analyzed in this study. Six DSM models were used,
three for each acquisition time. One of the DSM models used in the analysis is
visualized in Figure 9. The models are then grouped according to the flight altitude
parameter at acquisition time. The DSM data grouped by acquisition were analyzed to
determine the changes that occurred between the first and second epochs based on the
altitude changes.
Figure 9
Results of the formation of DSM
2.3 FILTERING THE DIGITAL SURFACE MODEL (DSM)
In this step, the filtering process was carried out on
two DSMs with the same flight altitude, using sample data from both DSMs at the two
measurement times [20]. This was done to eliminate large errors in the data caused by
the movement of objects other than the ground and by differences in light hue. Sampling
was done by forming a dense grid of points, as visualized in Figure 4. The filtering
process was assisted by the data trend graph visualized in Figure 10, which shows the
distribution of the data and the relative precision between the two models. The trend
graph was also used to detect errors in the comparison between the two models.
Errors appear as points far from the trend line in
Figure 10; they were eliminated iteratively with the help of statistical graphs to form
the trend that can be seen in Figure 11. The error range in this case was divided, based
on the statistical distribution, into 256 bins to classify which errors on non-ground
objects were accepted and which had to be eliminated.
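A minimal sketch of such trend-based cleaning, with synthetic paired heights in place of the real DSM samples and a plain 3-sigma rejection in place of the 256-bin classification:

```python
import numpy as np

def filter_pairs(z1, z2, k=3.0, max_iter=10):
    """Iteratively drop paired heights far from the fitted trend line,
    mimicking the graph-assisted cleaning described above."""
    keep = np.ones(z1.shape, dtype=bool)
    for _ in range(max_iter):
        a, b = np.polyfit(z1[keep], z2[keep], 1)   # linear trend z2 ~ a*z1 + b
        resid = z2[keep] - (a * z1[keep] + b)
        bad = np.abs(resid) > k * resid.std()
        if not bad.any():
            break
        idx = np.flatnonzero(keep)
        keep[idx[bad]] = False                      # remove outliers, refit
    return keep

# Synthetic paired heights: most points follow the trend; a few gross
# outliers stand in for moving vegetation and hue-change artefacts.
rng = np.random.default_rng(1)
z1 = rng.uniform(756.0, 766.0, 500)
z2 = z1 - 0.05 + rng.normal(0.0, 0.01, 500)
z2[:5] += rng.uniform(1.0, 3.0, 5)                  # gross errors on 5 points
mask = filter_pairs(z1, z2)
```

The kept points (near the trend line) carry the coherent height change between epochs; the rejected points correspond to the off-trend scatter in Figure 10.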
Figure 10
Visualization of model comparison trend graphs: flight height of 80 m (a), flight height of 100 m (b), and flight height of 150 m (c) (X axis: Model 1 Z value; Y axis: Model 2 Z value)
Figure 11
Visualization of the marking (right) and the error elimination process based on statistical data (left)
After the elimination, the trend graph is re-checked, as
visualized in Figure 12, to confirm the result before proceeding to the next step;
further analysis could be carried out if the trend graph was not far from the existing
population values, as shown in Figure 12.
Figure 12
Trend chart visualization after the filtering process: flight height of 80 m (a), flight height of 100 m (b), and flight height of 150 m (c) (X axis: Model 1 Z value; Y axis: Model 2 Z value)
At this stage, the existing errors can be grouped based
on visually distinguishable data groups, i.e., by their sources. The objects that
contain many errors are buildings, trees, and meadows. Building errors were caused by
the difference in hue between the first and second data collections and by light
reflections that were too strong, which inflated the height values on the object. Tree
errors were identified as due to the movement of branches and twigs, which makes the
height difference between the two models quite significant. Errors in the fields were
likewise identified as due to objects moved by wind gusts. In addition, some eliminated
errors can be tolerated. On the other hand, some apparent errors actually contain the
desired subsidence information; these usually occur on ground or road objects.
Results and Discussion
LAND SUBSIDENCE MODEL
The subsidence model was formed from the sample points
that passed the filtering stage, so that the model formed was free from random errors
and blunders. The formed models are shown in Figures 13-15.
The modeling was carried out using the kriging method, a
geostatistical method that estimates the value of a point or block as a linear
combination of the values around the point to be estimated; the kriging weights are
chosen to minimize the estimation variance by making use of the semivariogram.
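A compact sketch of ordinary kriging under these definitions; the spherical semivariogram and its sill and range values are illustrative choices, not the variogram fitted in this study:

```python
import numpy as np

def spherical(h, sill=1.0, rng_=50.0):
    """Spherical semivariogram (no nugget), an illustrative model."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, g, sill)

def ordinary_kriging(xy, z, x0):
    """Estimate z at x0 as a weighted sum of the samples; the weights
    solve the semivariogram system with an unbiasedness constraint."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d)          # sample-to-sample semivariances
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)[:n]     # kriging weights (Lagrange row dropped)
    return float(w @ z)

# Subsidence samples (x, y in metres; dz in metres, negative = down).
xy = np.array([[0.0, 0.0], [40.0, 0.0], [0.0, 40.0], [40.0, 40.0]])
dz = np.array([-0.05, -0.09, -0.06, -0.11])
# Kriging honours the data: predicting at a sample returns its value.
at_sample = ordinary_kriging(xy, dz, np.array([0.0, 0.0]))
# A grid-centre estimate lies within the range of neighbouring samples.
centre = ordinary_kriging(xy, dz, np.array([20.0, 20.0]))
```

Repeating the point estimate over a regular grid of target locations yields the continuous subsidence surfaces shown in Figures 13-15.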
Figure 13
Land subsidence model from the photogrammetry method with a flight height of 80 meters
Figure 14
Land subsidence model from the photogrammetry method with a flight height of 100 meters
Figure 15
Land subsidence model from the photogrammetry method with a flight height of 150 meters
3.1 LAND SUBSIDENCE MODEL
VALIDATION
Validation of the land subsidence model was carried out
by measuring sample points directly at locations with changes large enough to make the
objects easy to identify in the field. Data sampling was carried out in the part of the
study area undergoing dredging between the first and second data collection times.
Figure 16
Sketch of sampling point
The validation was done by extracting elevation values
from the subsidence model at validation points measured with a measuring tape. The
validation value was obtained from the standard deviation of the differences between
the height values at the locations, which can be seen in Figure 16. The standard
deviation values of the models from this study can be seen in Table 5.
Table 5
Validation table for the ground subsidence model, in meters
Description | Direct Calculation | Value 80 | Value 100 | Value 150 | Difference 80 | Difference 100 | Difference 150
Sample 3 | -1.283 | -1.256 | -1.307 | -0.977 | -0.027 | 0.024 | -0.306
Sample 2 | -0.943 | -0.882 | -0.955 | -0.663 | -0.061 | 0.012 | -0.280
Sample 1 | -0.625 | -0.643 | -0.814 | -0.519 | 0.018 | 0.189 | -0.106
Sample Standard Deviation | | | | | 0.040 | 0.099 | 0.109
Average | | | | | -0.023 | 0.075 | -0.231
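The validation statistics in Table 5 are simply the sample mean and sample standard deviation of the per-point differences (direct measurement minus model value); recomputing them from the table rows:

```python
import statistics

# Differences (direct minus model, metres) for the three validation
# samples at each flight height, taken from Table 5.
diff = {80: [-0.027, -0.061, 0.018],
        100: [0.024, 0.012, 0.189],
        150: [-0.306, -0.280, -0.106]}

# Sample standard deviation and mean per flight height.
stdev = {h: statistics.stdev(v) for h, v in diff.items()}
mean = {h: statistics.mean(v) for h, v in diff.items()}
# Rounded to millimetres, these reproduce the last two rows of Table 5.
```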
Conclusion
The UAV photogrammetry method is feasible for detecting
land subsidence if a minimum subsidence of 5 cm exists for a flight height of 80
meters, 10 cm for a flight height of 100 meters, and 11 cm for a flight height of 150
meters.
Additionally, we highlight several things to consider
when using UAV photogrammetry for subsidence monitoring. The acquisitions must use the
same flight parameters, including a sidelap of 60%, an overlap of 80%, and the lowest
flight altitude that the flight conditions allow. Error filtering should be done
iteratively, always paying attention to the error trend graph and the standard
deviation, so as not to eliminate height values that belong to the land subsidence
signal; the filtering should avoid removing the height values of ground objects. The
data modeling must also use the same parameters.
References
[1] F. Franceschini, M. Galetto, D. Maisano, and L. Mastrogiacomo, "Large-scale dimensional metrology (LSDM): from tapes and theodolites to multi-sensor systems," Int. J. Precis. Eng. Manuf., vol. 15, no. 8, pp. 1739-1758, 2014.
[2] H. Eisenbeiss, "UAV photogrammetry," ETH Zurich, 2009.
[3] G. Casagrande, "Opportunities," in Small Flying Drones, Springer, 2018, pp. 47-89.
[4] D. Griffiths and H. Burningham, "Comparison of pre- and self-calibrated camera calibration models for UAS-derived nadir imagery for a SfM application," Prog. Phys. Geogr. Earth Environ., vol. 43, no. 2, pp. 215-235, 2019.
[5] A. Fritz, T. Kattenborn, and B. Koch, "UAV-based photogrammetric point clouds: tree stem mapping in open stands in comparison to terrestrial laser scanner point clouds," Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci., vol. 40, pp. 141-146, 2013.
[6] D. Gasperini, P. Allemand, C. Delacourt, and P. Grandjean, "Potential and limitation of UAV for monitoring subsidence in municipal landfills," Int. J. Environ. Technol. Manag., vol. 17, no. 1, pp. 1-13, 2014.
[7] A. H.-M. Ng et al., "Mapping accumulated mine subsidence using small stack of SAR differential interferograms in the Southern Coalfield of New South Wales, Australia," Eng. Geol., vol. 115, no. 1-2, pp. 1-15, 2010.
[8] H. Z. Abidin, H. Andreas, I. Gumilar, Y. Fukuda, Y. E. Pohan, and T. Deguchi, "Land subsidence of Jakarta (Indonesia) and its relation with urban development," Nat. Hazards, 2011.
[9] M. Shirzaei, J. Freymueller, T. E. Tornqvist, D. L. Galloway, T. Dura, and P. S. J. Minderhoud, "Measuring, modelling and projecting coastal land subsidence," Nat. Rev. Earth Environ., vol. 2, no. 1, pp. 40-58, 2021.
[10] N. An et al., "Plant high-throughput phenotyping using photogrammetry and imaging techniques to measure leaf length and rosette area," Comput. Electron. Agric., vol. 127, pp. 376-394, 2016.
[11] M. Fabris and A. Pesci, "Automated DEM extraction in digital aerial photogrammetry: precisions and validation for mass movement monitoring," Ann. Geophys., vol. 48, no. 6, 2005.
[12] V.-E. Oniga, A.-I. Breaban, and F. Statescu, "Determining the optimum number of ground control points for obtaining high precision results based on UAS images," in Multidisciplinary Digital Publishing Institute Proceedings, 2018, vol. 2, no. 7, p. 352.
[13] O. Eroglu, "Information retrieval from spaceborne GNSS reflectometry observations using physics- and learning-based techniques," Electrical and Computer Engineering, 1986.
[14] G. J. J. Verhoeven, "It's all about the format: unleashing the power of RAW aerial photography," Int. J. Remote Sens., vol. 31, no. 8, pp. 2009-2042, 2010.
[15] K. N. Tahar, "An evaluation on different number of ground control points in unmanned aerial vehicle photogrammetric block," Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci., vol. 40, pp. 93-98, 2013.
[16] M. V. Y. Garcia and H. C. de Oliveira, "The influence of flight configuration, camera calibration, and ground control points for digital terrain model and orthomosaic generation using unmanned aerial vehicles imagery," Bol. Ciencias Geodesicas, vol. 27, 2021.
[17] B. Nagy, L. Kovacs, and C. Benedek, "SfM and semantic information based online targetless camera-lidar self-calibration," in 2019 IEEE International Conference on Image Processing (ICIP), 2019, pp. 1317-1321.
[18] D. Suwardhi et al., "3D surveying, modeling and geo-information system of the new campus of ITB-Indonesia," Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci., vol. 42, p. 97, 2016.
[19] E. Husson, H. Reese, and F. Ecke, "Combining spectral data and a DSM from UAS-images for improved classification of non-submerged aquatic vegetation," Remote Sens., vol. 9, no. 3, p. 247, 2017.
[20] B. Petzold, P. Reiss, and W. Stossel, "Laser scanning: surveying and mapping agencies are using a new technique for the derivation of digital terrain models," ISPRS J. Photogramm. Remote Sens., vol. 54, no. 2-3, pp. 95-104, 1999.
Copyright holder: Dimas C Alamsyah (2022)
First publication right: Syntax Literate: Jurnal Ilmiah Indonesia
This article is licensed under: