A proposal to minimize the cost of processing big geospatial data in public cloud providers
Author(s) -
Bachiega João,
Holanda Maristela,
Araujo Aleteia P. F.
Publication year - 2021
Publication title -
Transactions in GIS
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.721
H-Index - 63
eISSN - 1467-9671
pISSN - 1361-1682
DOI - 10.1111/tgis.12754
Subject(s) - geospatial analysis , cloud computing , big data , dynamism , computer science , data science , process (computing) , volume (thermodynamics) , set (abstract data type) , data set , data processing , database , data mining , geography , artificial intelligence , cartography , operating system , physics , quantum mechanics , programming language
Spatial data represent abstractions of real‐world entities and can be obtained in various ways. They have properties that differentiate them from other types of data, such as a complex structure and dynamism. In recent years, with the increasing volume of spatial data, referred to as “big geospatial data,” some tools have been developed to process such data efficiently, such as SpatialHadoop. The use of appropriate data indices, based on the data set to be processed as well as the queries and operations to be performed, is essential for the optimal performance of these applications. In particular, since public cloud providers’ charges are based on the resources used, it is imperative to optimize application execution in order to avoid unnecessary expense. This article proposes the use of six conditions that seek to minimize the cost of processing big geospatial data in public cloud providers. The tests performed demonstrate that the use of these conditions and the choice of the lowest‐cost provider can reduce the total processing cost by up to 41%.
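The cost model the abstract relies on (charges proportional to the resources used, so for a fixed workload the lowest-cost provider minimizes total expense) can be sketched as follows. The provider names, prices, and the simple node-hour pricing scheme below are illustrative assumptions for the sketch, not figures or conditions from the article itself:

```python
# Minimal sketch of lowest-cost provider selection for a fixed
# big-geospatial-data job. All prices and provider names are made up.

def job_cost(price_per_node_hour: float, nodes: int, hours: float) -> float:
    """Total cost of running a cluster job on one provider,
    assuming a flat per-node-hour rate (a simplification)."""
    return price_per_node_hour * nodes * hours

# Hypothetical per-node-hour rates in USD for three providers.
providers = {
    "provider_a": 0.12,
    "provider_b": 0.10,
    "provider_c": 0.15,
}

# Workload estimated in advance: cluster size and expected runtime.
nodes, hours = 8, 3.5

costs = {name: job_cost(rate, nodes, hours) for name, rate in providers.items()}
cheapest = min(costs, key=costs.get)
```

In practice the workload estimate itself depends on choices the article's conditions address (for example, which spatial index suits the data set and queries), since a poorly chosen index inflates the runtime term and therefore the cost on every provider.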
