In the panorama of the digitalisation of the built environment, the point cloud is often perceived as an objective and complete representation of reality: dense, precise and apparently self-sufficient, it seems to offer all the information necessary to describe an existing work in its complexity. This perception, however, is misleading.
From a technical point of view, it is a purely geometric (morphometric) representation of the existing: a set of three-dimensional coordinates that describes surfaces and volumes with a variable level of detail, but without any attribution of meaning or any spatial or functional relationships between the detected elements. Having a point cloud available, although it represents an advanced digital basis, is not sufficient to describe reality completely and exhaustively, to guarantee reliable knowledge of the work, or to effectively support the subsequent modeling, analysis and management phases of the project.
This uninterpreted nature constitutes one of the main limits in Scan-to-BIM processes, in which geometric data must be transformed, through analysis, reading and interpretation, into information models consistent with the objectives and intended uses.
From precision to interpretation: the real crux of the process
Evaluating the quality of a survey exclusively through geometric parameters, such as the density of the point cloud, is a reductive and misleading approach: it favors the metric dimension alone while neglecting the most critical aspect of the process, namely the ability to interpret the data and transform it into meaningful information.
A particularly dense dataset can suggest a high level of accuracy and completeness, yet at the same time hide significant critical issues, such as registration errors between scans, geometric distortions, noise, or gaps due to occlusions and operational limitations.
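The gap between density and quality can be made concrete with a minimal sketch on synthetic data (all numbers are illustrative): two scans of the same flat wall, merged with a small registration offset, yield an excellent point density while silently inflating the apparent thickness of the surface.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two overlapping scans of the same flat wall (plane z = 0), 5,000 points each.
scan_a = np.column_stack([rng.uniform(0, 4, 5000),   # 4 m wide
                          rng.uniform(0, 3, 5000),   # 3 m high
                          np.zeros(5000)])
# Scan B carries a 15 mm registration offset along z -- invisible in a density count.
scan_b = scan_a + np.array([0.0, 0.0, 0.015])

merged = np.vstack([scan_a, scan_b])
density = len(merged) / (4 * 3)          # points per m^2: looks excellent
# The z-spread of the merged cloud, however, exposes the misalignment:
wall_thickness = merged[:, 2].max() - merged[:, 2].min()

print(f"density: {density:.0f} pts/m^2, apparent wall thickness: {wall_thickness * 1000:.1f} mm")
```

The density figure alone would pass any threshold, while the 15 mm "thickness" of a surface that should be planar reveals the registration error a purely quantitative check would miss.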
Even the Level of Accuracy (LOA), which measures the deviation between the survey and the acquired reality, although representing a more significant parameter, is not sufficient on its own to guarantee the overall quality of the point cloud.
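Deviation statistics of this kind are straightforward to compute, but, taken alone, they say nothing about coverage or interpretability. A hypothetical check against a reference plane (synthetic noise, assumed values) illustrates what an LOA-style verification actually measures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey of a planar surface (reference plane z = 0),
# with 3 mm of simulated measurement noise.
points = np.column_stack([
    rng.uniform(0, 10, 20000),
    rng.uniform(0, 10, 20000),
    rng.normal(0.0, 0.003, 20000),
])

deviation = np.abs(points[:, 2])                 # point-to-plane distance
rms = np.sqrt(np.mean(points[:, 2] ** 2))        # overall dispersion
p95 = np.quantile(deviation, 0.95)               # value to compare with the contractual LOA band

print(f"RMS {rms * 1000:.1f} mm, 95th percentile {p95 * 1000:.1f} mm")
```

These statistics can then be compared against the accuracy band specified contractually; but even a compliant RMS coexists happily with occlusions, gaps and misclassified elements, which is why LOA alone does not guarantee overall quality.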
Added to this is the issue of the completeness of the survey: the point cloud never represents the totality of reality, since operating conditions, occlusions and access limits generate inevitable gaps. These are not anomalies, but intrinsic characteristics of the acquisition process that must be recognized, documented and managed. In the absence of conscious control of uncertainty, the risk is to fill such gaps with unverified information, compromising the overall reliability of the resulting Scan-to-BIM model.
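One way to recognize and document such gaps, sketched here with synthetic data and an assumed 0.5 m grid, is to compare the cells actually occupied by points against the expected coverage of the surveyed surface:

```python
import numpy as np

rng = np.random.default_rng(1)

# Surveyed facade of 10 m x 6 m; a 2 m x 2 m region is occluded (e.g. by scaffolding).
pts = np.column_stack([rng.uniform(0, 10, 30000), rng.uniform(0, 6, 30000)])
occluded = (pts[:, 0] > 4) & (pts[:, 0] < 6) & (pts[:, 1] > 2) & (pts[:, 1] < 4)
pts = pts[~occluded]

# 0.5 m grid: count the cells that actually contain points vs. the expected total.
ix = np.floor(pts[:, 0] / 0.5).astype(int)
iy = np.floor(pts[:, 1] / 0.5).astype(int)
occupied = len(set(zip(ix.tolist(), iy.tolist())))
expected = int((10 / 0.5) * (6 / 0.5))   # 20 x 12 = 240 cells
coverage = occupied / expected

print(f"coverage: {coverage:.1%} ({expected - occupied} empty cells to document)")
```

A coverage map of this kind turns an implicit gap into an explicit, documented one: the empty cells can be flagged in the deliverables instead of being silently filled with assumptions during modeling.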
Data reliability: between uncertainty and information stratification
The translation of a digital survey into a BIM environment involves selection, simplification and classification operations, taking the form of a decision-making process that introduces inevitable transformations of the original information. There is, therefore, no “neutral” transposition of reality: each model is the result of a mediation between what has been detected, what can be interpreted and the information objectives that guide its representation.
It is precisely in this phase, often underestimated, that the actual reliability and quality of a point cloud that can be integrated into BIM processes is determined. To facilitate this process, it may be useful to combine geometric data with additional information levels capable of expanding the representative capabilities of a point cloud.
A first level can be constituted by semantic classification, which allows meaning to be attributed to a set of points (for example by associating construction categories such as wall, floor slab or window). This operation can be extended further, up to the definition of a true ontology, through the identification of hierarchical, functional or spatial relationships between parts of the cloud.
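A minimal illustration of such enrichment (the class codes, labels and relationships below are hypothetical, not drawn from any standard) shows how per-point labels make the cloud queryable in ways that raw coordinates are not:

```python
import numpy as np

# Hypothetical semantic layer: each point carries a class code
# mapped to a construction category.
CLASSES = {0: "unclassified", 1: "wall", 2: "floor slab", 3: "window"}

points = np.array([[0.0, 0.0, 1.2],
                   [0.1, 0.0, 1.3],
                   [2.0, 3.0, 0.0],
                   [0.5, 0.0, 1.5]])
labels = np.array([1, 1, 2, 3])          # one class code per point

# A simple query the coordinates alone could never answer:
walls = points[labels == 1]
print(f"{len(walls)} wall points out of {len(points)} total")

# Ontology sketch: explicit relationships between classified parts.
relations = [("window", "hosted_by", "wall"),
             ("wall", "bounds", "floor slab")]
```

The point is structural rather than computational: once labels and relationships exist, the cloud supports queries and consistency checks, which is exactly what the subsequent modeling phases need.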
A further informative contribution can be derived from the integration of advanced instrumental data, often already available in the sensors but rarely used in operational processes. Data such as the return time of the laser signal, intensity or reflectance can, for example, provide useful clues about the material characteristics of the surfaces, improving the ability to read and interpret the elements. The integration of these information levels allows us to move from a point cloud understood as a simple set of coordinates to an enriched database, capable of more effectively supporting the subsequent phases of interpretation and information modeling.
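As a sketch of this kind of enrichment, consider a cloud that carries per-point intensity alongside the coordinates, as formats such as LAS allow; the values and the threshold below are illustrative assumptions, not calibrated reflectance:

```python
import numpy as np

# Structured cloud: coordinates plus a per-point intensity attribute
# (values and the 0.5 threshold are illustrative, not calibrated).
cloud = np.zeros(6, dtype=[("x", "f8"), ("y", "f8"), ("z", "f8"), ("intensity", "f4")])
cloud["x"] = [0, 1, 2, 3, 4, 5]
cloud["intensity"] = [0.9, 0.85, 0.1, 0.15, 0.88, 0.12]

# High returns might suggest plaster, low returns dark stone or glass --
# a material cue that no coordinate triple can provide on its own.
bright = cloud[cloud["intensity"] > 0.5]
print(f"{len(bright)} high-intensity points")
```

Carrying the attribute through the pipeline costs little, since most scanners already record it; discarding it throws away exactly the material information that later interpretation would have to reconstruct by hand.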
Quality is built into the Information Specifications
In light of what has been highlighted, the quality of the acquired data cannot be traced back to the survey phase alone, but must be governed throughout the entire transformation process from real to digital. In this perspective, the contracting authority takes on a central role, called upon to clearly and consciously define the requirements for a survey that will be integrated within a Scan-to-BIM process.
The question is not how precise a point cloud should be, but how consistent, readable and reliable the resulting information is with respect to its objectives and intended uses. A correct definition of information needs makes it possible to guide the methodological and operational choices of a survey, guaranteeing results that are truly effective and usable over time, and making the survey scalable to any subsequent uses.
From this operational perspective, stating that the point cloud is not sufficient in itself means recognizing that measurement does not coincide with knowledge and that the value of the survey does not lie in the quantity of data acquired, but in the ability to transform it into structured, verifiable information consistent with clear information requirements and established objectives.
Obviously the point cloud remains an indispensable tool in the process of digitizing buildings, but it represents only the starting point: it is through a conscious process of interpretation, structuring and verification, guided by well-defined requirements and processes, that the data becomes a real support for decision-making processes.