3.2.1 Model verification, validation, uncertainty analysis and calibration
Model verification, validation, uncertainty analysis and calibration of the virtual model should be performed following the methodology outlined in recent literature [12][13][14][15][16]. Uncertainty in computational models can arise from different sources:
1.     model inputs (variables and parameters);
2.     model form (assumptions, abstraction, etc.);
3.     numerical approximations (discretization, iteration and round-off errors).
 
In model validation, where simulation data are compared with measured data, the focus should be on the classification of model input and experimental uncertainty, which falls into two main categories:
1.     aleatory uncertainty (random process, requiring the use of a probability distribution function);
2.     epistemic uncertainty (lack of knowledge, requiring the use of interval data).
Experimental measured data attributed to aleatory uncertainty take the form of a random Gaussian error resulting from data collection with different instruments. For example, the U-value might be one of the quantities affected by aleatory uncertainty. Uncertainty propagates through the model, from input variables (dynamically evolving quantities) and parameters (fixed quantities) to the model outputs.
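As an illustration of this classification, the sketch below (Python, with purely hypothetical values for the U-value, its measurement error and an air-change-rate interval) shows how an aleatory input can be represented by a probability distribution and an epistemic input by an interval of data:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_samples = 1000  # Monte Carlo sample size

# Aleatory input: a measured quantity with random Gaussian error,
# e.g. a U-value of 0.45 W/(m2 K) with a standard deviation of
# 0.03 W/(m2 K) (illustrative values, not taken from the source).
u_value = rng.normal(loc=0.45, scale=0.03, size=n_samples)

# Epistemic input: a poorly known parameter described only by an
# interval, e.g. an air-change rate between 0.2 and 0.6 1/h
# (hypothetical bounds); sampled uniformly for the propagation study.
air_change_rate = rng.uniform(low=0.2, high=0.6, size=n_samples)
```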
Model input parameters affected by epistemic uncertainty (defined by interval data) may be summarized in a table such as the following:
Table 3.1 - Model input parameters

| Input parameter affected by epistemic uncertainty | Unit of measure | Input values |
|----------------------------------------------------|-----------------|--------------|
|                                                     |                 |              |
|                                                     |                 |              |
|                                                     |                 |              |
|                                                     |                 |              |
Note: The model output variables (validation metrics), expressed in terms of energy demand or hygrothermal properties, have to be considered.
 
The propagation of uncertainty through the model is obtained by running multiple simulations, and the results are compared with the experimental measurements and their related uncertainty.
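A minimal Monte Carlo sketch of this propagation, reusing the sampled inputs from the previous snippet and a purely hypothetical building_model() placeholder standing in for the virtual model, could look as follows:

```python
import numpy as np

def building_model(u_value, air_change_rate):
    """Placeholder for the virtual model (e.g. a dynamic simulation run);
    the linear expression is purely illustrative and returns a monthly
    energy demand in kWh."""
    return 120.0 + 300.0 * u_value + 80.0 * air_change_rate

# Multiple simulations: one model run per sampled input combination
outputs = np.array([building_model(u, ach)
                    for u, ach in zip(u_value, air_change_rate)])

print(f"simulated monthly demand: {outputs.mean():.1f} kWh "
      f"(std {outputs.std(ddof=1):.1f} kWh)")
```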
The whole process of verification, validation, uncertainty analysis and calibration is performed through the following essential steps:
1.     basic model verification, following the data and assumptions (Table 3.1);
2.     classification of the different sources of uncertainty (modelling or experimental) and definition of their mathematical structure (i.e. probability distribution functions or intervals of data);
3.     simulation of the propagation of uncertainty through the model, using the experimental input data;
4.     comparison of simulation outcomes with measured data;
5.     residual analysis;
6.     model calibration.
It is important to note that step 4 needs to consider both of the following (a simple consistency check is sketched after this list):
·       the measurement uncertainty (aleatory) of the experimental data, e.g. the accuracy of the instruments;
·       the uncertainty propagated through the model (combined aleatory and epistemic) for the simulated data.
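One simple way to account for both contributions, sketched below with illustrative numbers, is to compare the band of simulated outputs (combined aleatory and epistemic uncertainty, from the propagation sketch above) with the measured value expanded by its instrument accuracy:

```python
import numpy as np

measured_demand = 265.0   # kWh, illustrative monthly measurement
instrument_std = 5.0      # kWh, illustrative instrument accuracy (aleatory)

# Band of simulated outputs from the propagation step (2.5th-97.5th percentile)
sim_low, sim_high = np.percentile(outputs, [2.5, 97.5])

# Band of the measurement, expanded by two standard deviations
meas_low = measured_demand - 2.0 * instrument_std
meas_high = measured_demand + 2.0 * instrument_std

# The two bands overlapping is a minimal consistency requirement
overlap = (sim_low <= meas_high) and (meas_low <= sim_high)
print(f"simulation band [{sim_low:.1f}, {sim_high:.1f}] kWh, "
      f"measurement band [{meas_low:.1f}, {meas_high:.1f}] kWh, "
      f"consistent: {overlap}")
```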
Finally, residual analysis should be performed to carry out the model calibration.
 
3.2.2 Residual analysis
The residuals, calculated as the difference between the predicted and the actual values, provide an estimate of the average error (or deviation) about the regression line. Tests applied to these data can yield information on which program inputs are responsible for the residuals, and so give an indication of which physical processes are not adequately represented by the program.
Residuals have to be compared with a uniform or Gaussian distribution fitted to them, highlighting the substantial symmetry of the distributions for the validation metrics. In any case, the residual distribution indicates whether there is any visible correlation, whether it is statistically significant, and whether the model predicts values higher and lower than the actual ones with equal probability.
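A minimal residual-analysis sketch in Python, using a hypothetical monthly series of measured and simulated energy demands, could check the symmetry, normality and first-order autocorrelation of the residuals:

```python
import numpy as np
from scipy import stats

# Hypothetical monthly energy demands in kWh (12 values each)
measured = np.array([310, 280, 250, 190, 140, 90, 70, 75, 120, 180, 240, 300],
                    dtype=float)
simulated = np.array([298, 285, 244, 198, 150, 95, 66, 80, 112, 172, 251, 309],
                     dtype=float)

residuals = measured - simulated

# Symmetry and normality of the residual distribution
print(f"mean residual: {residuals.mean():.1f} kWh, "
      f"skewness: {stats.skew(residuals):.2f}")
stat, p_value = stats.shapiro(residuals)
print(f"Shapiro-Wilk p-value: {p_value:.2f} (p > 0.05: Gaussian fit plausible)")

# Lag-1 autocorrelation as a simple check for visible correlation
lag1 = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
print(f"lag-1 autocorrelation: {lag1:.2f}")
```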

3.2.3 Model calibration

Finally, the virtual model has to be calibrated against the monthly measured data, by identifying the values of the model input parameters (initially unknown, but treated as epistemic uncertainties) that guarantee the best fit with respect to the fundamental quantities chosen as indicators of model calibration. According to the literature review, the main metrics to consider are:
·       the coefficient of determination (R²);
·       the coefficient of variation of the Root Mean Square Error (CV(RMSE));
·       the Normalized Mean Bias Error (NMBE).
The mathematical formulations of the three metrics used for calibration are reported below.
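A commonly adopted formulation of these metrics is sketched here (the exact form in the cited literature may differ, e.g. in the degrees-of-freedom correction p). In the expressions below, m_i are the measured values, s_i the simulated values, \bar{m} the mean of the measured values, n the number of data points and p the number of adjustable model parameters (often taken as 1):

\[
\mathrm{NMBE} = \frac{\sum_{i=1}^{n} (m_i - s_i)}{(n - p)\,\bar{m}} \times 100
\]

\[
\mathrm{CV(RMSE)} = \frac{100}{\bar{m}} \sqrt{\frac{\sum_{i=1}^{n} (m_i - s_i)^2}{n - p}}
\]

\[
R^2 = 1 - \frac{\sum_{i=1}^{n} (s_i - m_i)^2}{\sum_{i=1}^{n} (m_i - \bar{m})^2}
\]

A direct Python implementation of these expressions (a sketch, using p = 1 as a common default) can be applied, for instance, to the hypothetical monthly series used in the residual-analysis example above:

```python
import numpy as np

def calibration_metrics(measured, simulated, p=1):
    """Return NMBE [%], CV(RMSE) [%] and R2 for the calibration check
    (formulations as sketched above; p is the number of adjustable
    model parameters, taken as 1 by default)."""
    m = np.asarray(measured, dtype=float)
    s = np.asarray(simulated, dtype=float)
    n = m.size
    m_bar = m.mean()
    diff = m - s
    nmbe = diff.sum() / ((n - p) * m_bar) * 100.0
    cv_rmse = np.sqrt((diff ** 2).sum() / (n - p)) / m_bar * 100.0
    r2 = 1.0 - (diff ** 2).sum() / ((m - m_bar) ** 2).sum()
    return nmbe, cv_rmse, r2

# Example usage with the hypothetical monthly series from the residual analysis:
# nmbe, cv_rmse, r2 = calibration_metrics(measured, simulated)
```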