The physical properties of porous media, such as permeability and transport capacity, are strongly related to their spatial structure. The distance transform (DT) can describe the pore distribution and topological structure of porous media. In digital rock images, the skeleton, or medial axis, is an important shape descriptor because it preserves both the topological and geometrical information of the core sample, and surface skeletons can be used to remove noise from 3D shapes [1]. Typical skeletonization approaches can be categorized into four types: algorithms based on general-field functions [2, 3], algorithms based on distance transformation [4, 5], thinning and boundary propagation [6, 7], and algorithms based on the Voronoi diagram [8, 9]. In addition, with the development of neural networks in recent years, the Multi-Scale Bidirectional Fully Convolutional Network (MSB-FCN) can capture and integrate multi-scale high-level context information for object skeleton detection [10], but it still has shortcomings. The distance transformation computes, for each point in space, its distance to a set of target points; it is a powerful tool in digital image processing, computer vision, pattern recognition, shape analysis, and computational geometry [11], and an important preprocessing step in complex image analysis systems. It also allows more precise control of the skeleton structure [12]. At present, it has important applications in digital core technology, which is used for the microscopic study of oil/gas reservoirs. The application of digital core technology in petrophysics falls into two categories: direct analysis based on 3D digital images, and numerical simulation of petrophysical properties based on 3D digital cores. Pore structure, mineral composition, grain size distribution, and other information can be obtained through image analysis [13].
By using digital core technology, the pore space of a reservoir can be finely characterized, and the flow and distribution of fluids in the pore space can then be simulated [14], yielding physical parameters such as reservoir permeability, formation factor, and resistivity index. Skeletons provide a simple and compact representation of a 2D/3D shape that preserves many of the topological and size characteristics of the original. The process can be viewed as a transformation that reduces the width of a binary pattern to a single pixel. Essentially, this is achieved by successively removing points or layers of the outline from a binary pattern until all the lines or curves are of unit width. The resulting set of lines or curves is called the skeleton of the pattern. The purpose of skeletonization is thus to reduce the amount of redundant data embedded in a binary image and to facilitate the subsequent extraction of distinctive features.
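The layer-by-layer removal described above is the basis of classic thinning algorithms. Purely as an illustration (not tied to any specific reference in this paper), the following is a minimal pure-Python sketch of the well-known Zhang–Suen thinning scheme, which deletes boundary pixels in two alternating subiterations until the pattern has unit width:

```python
def zhang_suen_thin(img):
    """Thin a binary image (list of lists of 0/1) toward a unit-width skeleton."""
    rows, cols = len(img), len(img[0])
    img = [row[:] for row in img]  # work on a copy

    def neighbours(r, c):
        # p2..p9: the 8 neighbours, clockwise starting from the north neighbour.
        def at(rr, cc):
            return img[rr][cc] if 0 <= rr < rows and 0 <= cc < cols else 0
        return [at(r - 1, c), at(r - 1, c + 1), at(r, c + 1), at(r + 1, c + 1),
                at(r + 1, c), at(r + 1, c - 1), at(r, c - 1), at(r - 1, c - 1)]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for r in range(rows):
                for c in range(cols):
                    if img[r][c] != 1:
                        continue
                    p = neighbours(r, c)
                    b = sum(p)  # number of object neighbours
                    # a = number of 0 -> 1 transitions around the pixel
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
                    if step == 0:  # first subiteration: peel NE/SW boundary
                        cond = p[0] * p[2] * p[4] == 0 and p[2] * p[4] * p[6] == 0
                    else:          # second subiteration: peel SE/NW boundary
                        cond = p[0] * p[2] * p[6] == 0 and p[0] * p[4] * p[6] == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_delete.append((r, c))
            for r, c in to_delete:  # delete simultaneously after scanning
                img[r][c] = 0
            changed = changed or bool(to_delete)
    return img
```

A pattern that is already one pixel wide is left unchanged, while a solid strip is eroded from both sides until only a thin line remains.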

The aim of the distance transformation is to obtain a distance field under a metric such as the Euclidean or chessboard distance [15], and the ridge of the distance field is taken as the skeleton. Because ridge extraction is difficult, it is not easy to obtain an accurate skeleton. Generally, each point is assigned its distance to the nearest boundary point, and the skeleton is defined as the set of points whose distance to the nearest boundary is a local maximum. Although several techniques have been presented to realize the Euclidean Distance Transformation (EDT), most of them are either inefficient or complex to implement. Furthermore, they require extra time and space, such as special structures or storage for recording information, especially for real digital cores, which contain large amounts of data. Cuisenaire and Macq [16] proposed a fast EDT by propagation using multiple neighborhoods, but bucket sorting is required before calculating the EDT. Datta and Soundaralakshmi [17] proposed a constant-time algorithm for the EDT, but it relies on a special hardware structure, the reconfigurable mesh. Saito and Toriwaki [18] proposed an algorithm for computing the EDT in an n-dimensional domain, but its time complexity is high. Wang and Tan [19] improved this algorithm and proposed an efficient EDT algorithm for binary images in arbitrary dimensions, which reduced the time complexity effectively. However, none of these solved the problem of traversing and computing on excessively large data. Previously, the distance from the pore space to the solid particle boundary was calculated by methods such as random simulation, manual measurement, and parallel calculation, which suffer from inaccuracy, long calculation times, and high hardware demands. Therefore, a new algorithm that can perform the distance transformation quickly and effectively is urgently needed.
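To make the above concrete, the following is a minimal 2D sketch in the spirit of the Saito–Toriwaki decomposition [18]: the exact squared EDT is obtained from independent 1-D row scans followed by a per-column minimization (function and variable names are illustrative, not from the cited work):

```python
def edt_squared(grid):
    """Exact squared Euclidean distance of each pixel to the nearest 0 pixel.

    Two-phase approach: first the distance to the nearest 0 along each row,
    then an independent minimisation down each column.
    """
    rows, cols = len(grid), len(grid[0])
    inf = rows + cols  # larger than any in-image axial distance

    # Phase 1: distance to the nearest 0 along each row (forward + backward scan).
    g = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        d = inf
        for c in range(cols):
            d = 0 if grid[r][c] == 0 else d + 1
            g[r][c] = d
        d = inf
        for c in reversed(range(cols)):
            d = 0 if grid[r][c] == 0 else d + 1
            g[r][c] = min(g[r][c], d)

    # Phase 2: combine the row distances down each column.
    # dist2[r][c] = min_k ( g[k][c]^2 + (r - k)^2 )
    dist2 = [[0] * cols for _ in range(rows)]
    for c in range(cols):
        for r in range(rows):
            dist2[r][c] = min(g[k][c] ** 2 + (r - k) ** 2 for k in range(rows))
    return dist2
```

The brute-force inner minimization makes this sketch O(n³) on an n × n image; the cited algorithms replace it with a smarter scan, but the two-phase structure is the same.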
The Euclidean distance transformation (EDT) is realized by calculating the distance between each point and all background points in the image and then selecting the minimum value. Thus, the EDT is complicated and time-consuming. For large-scale data, it is not realistic to use a brute-force EDT algorithm to obtain Euclidean distance values quickly and accurately. Therefore, if there is no strict requirement on calculation precision, the distance transformation is usually accomplished by approximate metrics, such as the Manhattan distance [20], Chebyshev distance [21], and chamfer distance [22]. To ensure the rotation invariance and accuracy of the Euclidean distance, many efforts have been made to reduce the complexity of its calculation, and several Euclidean distance transform algorithms based on morphology, contour tracing, Voronoi diagrams, and boundary stripping have been proposed [23–25]. However, most of these algorithms are applicable only to two-dimensional images and are inefficient. Shih et al. proposed a fast scanning algorithm for 2D images that uses computational neighborhoods to transfer the relative addresses between pixels [26]. This algorithm improved the computing speed, gained wide recognition and application, and provided a basis for 3D Euclidean distance transformation (CEDT).
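The approximate metrics mentioned above can all be computed with one forward and one backward raster scan over the image. A minimal sketch with adjustable neighbor weights (the weight pairs are the standard choices for each metric; the code itself is only illustrative):

```python
INF = float('inf')

def chamfer_dt(grid, axial=1, diag=2):
    """Two-pass chamfer distance transform to the nearest 0 pixel.

    (axial, diag) = (1, 2) gives the city-block (Manhattan) metric,
    (1, 1) the chessboard (Chebyshev) metric, and (3, 4) the classic
    chamfer 3-4 approximation of Euclidean distance (values scaled by 3).
    """
    rows, cols = len(grid), len(grid[0])
    d = [[0 if v == 0 else INF for v in row] for row in grid]

    # Forward pass: propagate distances from the upper-left neighbours.
    for r in range(rows):
        for c in range(cols):
            if r > 0:
                d[r][c] = min(d[r][c], d[r - 1][c] + axial)
                if c > 0:
                    d[r][c] = min(d[r][c], d[r - 1][c - 1] + diag)
                if c < cols - 1:
                    d[r][c] = min(d[r][c], d[r - 1][c + 1] + diag)
            if c > 0:
                d[r][c] = min(d[r][c], d[r][c - 1] + axial)

    # Backward pass: propagate distances from the lower-right neighbours.
    for r in reversed(range(rows)):
        for c in reversed(range(cols)):
            if r < rows - 1:
                d[r][c] = min(d[r][c], d[r + 1][c] + axial)
                if c < cols - 1:
                    d[r][c] = min(d[r][c], d[r + 1][c + 1] + diag)
                if c > 0:
                    d[r][c] = min(d[r][c], d[r + 1][c - 1] + diag)
            if c < cols - 1:
                d[r][c] = min(d[r][c], d[r][c + 1] + axial)
    return d
```

Two linear scans regardless of image content is exactly what makes these approximations attractive when precision requirements are loose.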

When the amount of data reaches a certain level, a serial algorithm suffers from a large increase in computing time and a serious shortage of computer memory. Therefore, parallel computing is more effective when the data volume is large. Manduhu and Jones presented a fully parallelized, work-time-optimal algorithm for computing the exact Euclidean Distance Transform (EDT) of an n × n 2D binary image [27]. In addition, a new parallel algorithm was proposed for computing the Euclidean distance map and Voronoi diagram of a binary image that mixes CUDA multi-thread parallel image processing with raster propagation of distance information over small fragments of the image [28]. However, neither of these considered the boundary value problem.
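Why the EDT parallelizes well can be seen from the two-phase decomposition: in a row-scan-then-column-scan exact EDT, the per-column phase is independent across columns, so the columns can simply be divided among workers. A minimal Python sketch using a thread pool (illustrative only; in CPython, real speedups for CPU-bound work would require processes or native code, and the worker/function names here are assumptions):

```python
from concurrent.futures import ThreadPoolExecutor

def row_scan(grid):
    """Phase 1: per-row distance (in pixels) to the nearest 0 pixel."""
    rows, cols = len(grid), len(grid[0])
    inf = rows + cols
    g = []
    for row in grid:
        d, line = inf, []
        for v in row:
            d = 0 if v == 0 else d + 1
            line.append(d)
        d = inf
        for c in reversed(range(cols)):
            d = 0 if row[c] == 0 else d + 1
            line[c] = min(line[c], d)
        g.append(line)
    return g

def column_min(g, c):
    """Phase 2 for a single column: exact squared EDT down column c."""
    rows = len(g)
    return [min(g[k][c] ** 2 + (r - k) ** 2 for k in range(rows))
            for r in range(rows)]

def parallel_edt_squared(grid, workers=4):
    """Exact squared EDT with the per-column phase spread over a pool."""
    g = row_scan(grid)
    cols = len(grid[0])
    with ThreadPoolExecutor(max_workers=workers) as pool:
        col_results = list(pool.map(lambda c: column_min(g, c), range(cols)))
    # Re-assemble: col_results[c][r] -> dist2[r][c]
    return [[col_results[c][r] for c in range(cols)]
            for r in range(len(grid))]
```

Because each column task reads the shared phase-1 array but writes only its own output, no synchronization is needed, which is the property the cited parallel algorithms exploit at much larger scale.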

In this paper, a parallel computing algorithm based on the CEDT algorithm, called PCEDT, is proposed. The PCEDT algorithm partitions large data sets and loads the blocks into the processor sequentially, which maximizes CPU utilization and makes the distance transformation more efficient.
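The following is not the PCEDT algorithm itself, but the general idea of partitioning a volume into blocks that fit in memory can be sketched as follows; the slab/halo scheme and all names here are a hypothetical illustration of how a chunked distance transform can also respect boundary values between blocks:

```python
def slab_ranges(n_slices, slab, halo):
    """Split n_slices into slabs of at most `slab` slices.

    Each slab is padded with up to `halo` extra slices on both sides so
    that neighbourhood operations near slab boundaries see the data they
    need. Returns a list of ((load_lo, load_hi), (valid_lo, valid_hi))
    pairs: the slice range to load, and the sub-range within the loaded
    block whose results are valid and should be written back.
    """
    ranges = []
    for start in range(0, n_slices, slab):
        stop = min(start + slab, n_slices)
        lo = max(start - halo, 0)          # include halo below
        hi = min(stop + halo, n_slices)    # include halo above
        ranges.append(((lo, hi), (start - lo, stop - lo)))
    return ranges
```

Processing then iterates over these ranges, loading one padded slab at a time and keeping only the valid interior, so memory use is bounded by the slab size rather than the full volume.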