In this research, we propose a MapReduce algorithm for creating contiguity-based spatial weights. This algorithm can create spatial weights from very large spatial datasets efficiently by using computing resources organized in the Hadoop framework. It works in the MapReduce paradigm: mappers are distributed across computing clusters to find contiguous neighbors in parallel, and reducers then collect the results and generate the weights matrix. To test the performance of this algorithm, we design an experiment that creates a contiguity-based weights matrix from artificial spatial data with up to 190 million polygons using Amazon's Hadoop framework, Elastic MapReduce. The experiment demonstrates the scalability of this parallel algorithm, which utilizes large computing clusters to solve the problem of creating contiguity weights on big data.
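The abstract does not spell out the mapper and reducer logic, but a common way to realize this pattern is to key polygons by their normalized boundary edges, so that polygons sharing an edge meet at the same reducer. The sketch below illustrates this idea for rook contiguity on Hadoop; the class names, the flat text input format (a polygon ID followed by its ring vertices), and the edge-keying scheme are illustrative assumptions, not details taken from the paper.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class ContiguityWeights {

    // Mapper: emit every boundary edge of a polygon, with its endpoints
    // put in canonical order so that two polygons sharing an edge produce
    // the identical key. Assumed (hypothetical) input line format:
    // "polygonId<TAB>x1,y1 x2,y2 ... xn,yn".
    public static class EdgeMapper extends Mapper<Object, Text, Text, Text> {
        @Override
        protected void map(Object key, Text value, Context ctx)
                throws IOException, InterruptedException {
            String[] parts = value.toString().split("\t");
            String polyId = parts[0];
            String[] verts = parts[1].split(" ");
            for (int i = 0; i < verts.length; i++) {
                String a = verts[i];
                String b = verts[(i + 1) % verts.length]; // wrap to close the ring
                String edge = a.compareTo(b) < 0 ? a + "|" + b : b + "|" + a;
                ctx.write(new Text(edge), new Text(polyId));
            }
        }
    }

    // Reducer: any polygons grouped under the same edge key share that
    // edge and are rook-contiguous; emit each neighbor pair in both
    // directions to build the rows of the weights matrix.
    public static class NeighborReducer extends Reducer<Text, Text, Text, Text> {
        @Override
        protected void reduce(Text edge, Iterable<Text> ids, Context ctx)
                throws IOException, InterruptedException {
            List<String> polys = new ArrayList<>();
            for (Text id : ids) polys.add(id.toString());
            for (int i = 0; i < polys.size(); i++) {
                for (int j = i + 1; j < polys.size(); j++) {
                    ctx.write(new Text(polys.get(i)), new Text(polys.get(j)));
                    ctx.write(new Text(polys.get(j)), new Text(polys.get(i)));
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "contiguity-weights");
        job.setJarByClass(ContiguityWeights.class);
        job.setMapperClass(EdgeMapper.class);
        job.setReducerClass(NeighborReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Keying by shared edges keeps the mappers embarrassingly parallel, since each polygon is processed independently, while the shuffle phase does the work of bringing candidate neighbors together; for queen contiguity one would key by individual vertices instead of edges.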