
model, one must not even have to visualize the model to obtain the results, which are numbers indicating the volume, surface, and rate of void in the incorporating rock.

Creating realistic models, however, is a more common aim among cavers. Besides the table and map views, cave surveying programs have long provided 3D visual representations of survey results, helping cavers understand the passage structure more easily. In the popular surveying programs, modeling is also based on the extrusion of a geometrical object along the station-target vector, but to enhance the model resolution, the number of vertices in the transversal sections is increased. Instead of just 4 (LRUD method), 6 or 12 equally distributed radial vectors are measured around a station perpendicular to the station-target vector [7]. The sections are placed along the polyline network, and the more vertices are measured, the more realistic the model will be. The edges of the adjoining shapes are also smoothed automatically using tangential curves and radial basis functions.
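The placement of radial section vertices around a survey shot can be sketched as follows. This is a minimal illustration, assuming evenly spaced angles around the shot; the function names are hypothetical, and real surveying programs may use the measured radial directions directly.

```python
import math

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _normalize(a):
    n = math.sqrt(sum(c * c for c in a))
    return tuple(c / n for c in a)

def cross_section_vertices(station, target, radii):
    """Place len(radii) radial vertices around `station` in the plane
    perpendicular to the station->target vector (e.g. 6 or 12 measured
    radii, as in the method cited in the text)."""
    d = _normalize(tuple(t - s for s, t in zip(station, target)))
    # any helper vector not parallel to the shot direction
    helper = (0.0, 0.0, 1.0) if abs(d[2]) < 0.9 else (1.0, 0.0, 0.0)
    u = _normalize(_cross(d, helper))   # first in-plane axis
    v = _cross(d, u)                    # second in-plane axis
    verts = []
    for k, r in enumerate(radii):
        ang = 2.0 * math.pi * k / len(radii)
        verts.append(tuple(s + r * (math.cos(ang) * ui + math.sin(ang) * vi)
                           for s, ui, vi in zip(station, u, v)))
    return verts
```

Connecting the vertex rings of consecutive stations with triangles then yields the extruded passage model described above.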

**4.4. Quality control**


When working with LiDAR data, automatic methods help fit the point clouds of different stations to each other. The fitting result is described with statistical parameters and, in some cases, with a new attribute of each point showing the quality (*quality map*).
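A per-point quality attribute of this kind can be sketched as a nearest-neighbour residual between two registered scans. The brute-force search below is purely illustrative; real TLS software uses spatial indexing and ICP registration residuals.

```python
import math

def quality_map(cloud_a, cloud_b):
    """For every point of scan A, store the distance to its nearest
    neighbour in the registered scan B as a per-point quality attribute,
    and report the RMS of these residuals as a global fitting statistic."""
    residuals = [min(math.dist(p, q) for q in cloud_b) for p in cloud_a]
    rms = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    return residuals, rms
```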

Automatic processes in data management are mainly responsible for data loading and updating. This automatism occurs when the database is located on a server, while the GIS interface is on a client computer. If a working GIS is established and the database connections are defined, SQL scripts can update the client side by regularly querying the database server. The data upload process is also automated in this case: the data logged in a survey management program (or just in an Excel sheet) is written in a certain file format, which can be data-mined with scripts (small programs developed for repeatedly occurring tasks). The script code can work with any type of data (raster, vector, or alphanumeric). It extracts the data from the structured file and uploads it to the server-side database. The data-mining scripts only work well if the files are located on the predefined path/folder; otherwise, the data is not loaded into the main database.
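The data-mining step can be sketched as below. The folder layout, the CSV column names, and the single-table schema are illustrative assumptions, and an in-memory SQLite database stands in for the server-side database.

```python
import csv
import sqlite3
from pathlib import Path

def load_survey_files(folder, db):
    """Scan the predefined folder for survey CSV files and upload their
    rows into the database; files placed anywhere else are ignored, so
    they never reach the main database (as noted in the text)."""
    db.execute("CREATE TABLE IF NOT EXISTS shots "
               "(station TEXT, target TEXT, dist REAL, azi REAL, incl REAL)")
    for path in sorted(Path(folder).glob("*.csv")):
        with open(path, newline="") as f:
            rows = [(r["station"], r["target"],
                     float(r["dist"]), float(r["azi"]), float(r["incl"]))
                    for r in csv.DictReader(f)]
        db.executemany("INSERT INTO shots VALUES (?, ?, ?, ?, ?)", rows)
    db.commit()
```

In a production setup, the same script would run on a schedule and connect to the database server instead of a local file.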

One of the most crucial issues of archive data processing is the estimation of the errors present in the sources. Errors mostly affect the spatial positioning of the base data; thus, it is important to find ways to compare the existing data to a reference that can be trusted. It is also important to know how the errors were originally introduced into the data. The QC of archive data is usually based on new (control) measurements.

For example, if loop correction was done with survey management software, the geographic position of the passages might have changed drastically. On the printed editions of the map, this was not always fully tracked. The farther back in the past we go, the bigger the chance of finding cave maps with uncorrected parts edited manually after new survey sequences. In fact, most of the result maps of archive surveys inevitably bear this kind of inhomogeneous error distributed over the whole area.

This produces many possibilities for subsequent misinterpretation, and first of all, we have to obtain a consistent database of the archive survey tracks to calculate loop closures. Though, this
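The loop-closure calculation can be sketched as follows: each leg (distance, azimuth, inclination) is converted to a Cartesian vector, and the residual of their sum around a closed traverse is the misclosure. The function names are illustrative; dedicated survey software distributes this residual back along the loop.

```python
import math

def shot_to_vector(dist, azi_deg, incl_deg):
    """Convert one survey leg to a Cartesian (dE, dN, dZ) vector."""
    azi, incl = math.radians(azi_deg), math.radians(incl_deg)
    horiz = dist * math.cos(incl)
    return (horiz * math.sin(azi), horiz * math.cos(azi),
            dist * math.sin(incl))

def loop_misclosure(legs):
    """Sum the legs of a closed traverse; a perfect loop returns to the
    start, so the length of the residual vector is the misclosure.
    Returns the misclosure and its ratio to the total loop length."""
    e = n = z = total = 0.0
    for dist, azi, incl in legs:
        de, dn, dz = shot_to_vector(dist, azi, incl)
        e, n, z, total = e + de, n + dn, z + dz, total + dist
    mis = math.sqrt(e * e + n * n + z * z)
    return mis, mis / total
```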

**5. Uniqueness of cave investigation**

Caves are unique systems that evolved under unique geological, morphological, and climatic circumstances. Although following the processing sequences and using automation may help one to process and analyze the data quickly, one has to keep in mind that a given cave may differ from the previously processed one. The uniqueness comes on the one hand from the differing aims, but it also originates from the particularities of the surveying techniques. Concerning the surveying part of the investigation, the position of the cave relative to the groundwater level is the most influential factor, followed by the passage geometry. Surveying can be extremely difficult in subaquatic caves, where the water is muddy, and easy in dry and comfortably wide passages with box-shaped transversal sections.

During the last decade, TLS surveying technology has evolved to a level of flexibility that makes it adaptable to different geometric conditions in caves [24]. There also exist several well-documented projects that provide the necessary stepwise help in combining photo documentation with TLS survey data. The uniqueness of the geological settings and the hydrogeological history of the area, however, remains an issue in interpreting the data. Thus, purely mathematical approaches extrapolating throughout whole regions must be handled with caution, because they may lead to false results. It has been shown by several authors that, on a regional scale, the regularities of cave distribution in karst depend on variables like the thickness and dipping of beds [25], the presence/abundance of tectonic fractures [26], the rate and direction of vertical movements [27], and the hydrological settings (hypogenic/epigenic conditions [28]).
