.. _reporting:

=======================
Reporting
=======================

Validation results can be visualized in an HTML report using the function
:meth:`wind_validation.reporting.create_report`. Currently, there are three
types of reports one can generate:

1. **Aggregated report**

   The aggregated report takes one or many points resulting from the
   :meth:`wind_validation.validation.validate` method. By default, the report
   opens in your browser for inspection. It shows mean error values and
   histograms or distributions of error metrics for point-to-point comparisons
   between model and observations.

2. **Comparison report**

   The comparison report is similar, but it accepts a list of multiple
   validation results and shows a side-by-side comparison of how different
   models perform on the same set of sensor data points, which can help you
   interpret which model performs better. The report type is selected by the
   type of the input: a single validation object produces an aggregated
   report, while a list produces a comparison report (see the example at the
   end of this section). For more information on the accepted types, please
   see the documentation below.

3. **Detailed report**

   This report is generated when you pass in a dataset with a single point.
   All the metrics and statistics that were used to generate the validation
   are reported, and the map is automatically zoomed in to the site. This can
   be useful if you want to do a detailed investigation of a single site, for
   example if it shows large errors.

.. autofunction:: wind_validation.reporting.create_report
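
As an illustration of how the report type follows from the input type, the
sketch below calls :meth:`wind_validation.reporting.create_report` with a
single validation result and then with a list of results. The exact call
signatures of :meth:`wind_validation.validation.validate` and
:meth:`wind_validation.reporting.create_report` are not spelled out in this
section, so the arguments below are assumptions; refer to the API
documentation above for the authoritative interface.

.. code-block:: python

   from wind_validation.validation import validate
   from wind_validation.reporting import create_report

   # Placeholders for your own model and observation datasets; the data
   # formats expected by ``validate`` are not covered in this section.
   model_a = ...
   model_b = ...
   observations = ...

   # Assumed usage: one validation result per model/observation pairing.
   results_a = validate(model_a, observations)
   results_b = validate(model_b, observations)

   # A single validation result produces an aggregated report
   # (or a detailed report when the dataset contains only one point).
   create_report(results_a)

   # A list of validation results produces a side-by-side comparison report.
   create_report([results_a, results_b])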