Metrics of Ensemble Forecast Skill (hydrostats.ens_metrics)

hydrostats.ens_metrics Module

The ens_metrics module contains all of the metrics included in hydrostats that measure forecast skill. Each metric is implemented as a function, and every function can handle missing values as well as remove zero and negative values from the time series data. Warnings issued during function execution tell users which start dates were removed.
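A minimal usage sketch of this behavior is shown below. It assumes the keyword names follow the signatures listed in the Functions summary (obs, fcst_ens, remove_neg, remove_zero); the synthetic data and the exact keyword set are illustrative assumptions, not a prescribed API.

    # Minimal sketch (assumed keyword names per the summary below).
    import numpy as np
    import hydrostats.ens_metrics as em

    np.random.seed(0)

    # Synthetic data: 100 start dates, 51 ensemble members.
    obs = np.random.gamma(shape=2.0, scale=10.0, size=100)
    fcst_ens = obs[:, np.newaxis] + np.random.normal(scale=5.0, size=(100, 51))

    # Inject a missing value and a negative observation; the metric function
    # is expected to drop those start dates and warn which ones were removed.
    obs[10] = np.nan
    obs[20] = -1.0

    mean_error = em.ens_me(obs=obs, fcst_ens=fcst_ens, remove_neg=True)
    print(mean_error)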

Functions

ens_me(obs[, fcst_ens, remove_zero, …]) Calculate the mean error between observed values and the ensemble mean.
ens_mae(obs[, fcst_ens, remove_zero, …]) Calculate the mean absolute error between observed values and the ensemble mean.
ens_mse(obs[, fcst_ens, remove_zero, …]) Calculate the mean squared error between observed values and the ensemble mean.
ens_rmse(obs[, fcst_ens, remove_zero, …]) Calculate the root mean squared error between observed values and the ensemble mean.
ens_pearson_r(obs, fcst_ens[, remove_neg, …]) Calculate the Pearson correlation coefficient between observed values and the ensemble mean.
crps_hersbach(obs, fcst_ens[, remove_neg, …]) Calculate the continuous ranked probability score (CRPS) as per equations 25-27 in Hersbach et al.
crps_kernel(obs, fcst_ens[, remove_neg, …]) Compute the kernel representation of the continuous ranked probability score (CRPS).
ens_crps(obs, fcst_ens[, adj, remove_neg, …]) Calculate the ensemble-adjusted continuous ranked probability score (CRPS).
ens_brier([fcst_ens, obs, threshold, …]) Calculate the ensemble-adjusted Brier Score.
auroc([fcst_ens, obs, threshold, …]) Calculate the area under the relative operating characteristic curve (AUROC) for a forecast and its verifying binary observation, and estimate the variance of the AUROC.
skill_score(scores, bench_scores, perf_score) Calculate the skill score of forecast scores relative to benchmark scores, given the score of a perfect forecast.
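The functions above cover a typical verification workflow: score the forecast, score a benchmark, and convert the pair into a skill score. The sketch below illustrates that workflow under stated assumptions: the resampled "climatology" benchmark is invented for illustration, scalar scores are assumed to be accepted by skill_score, and a perfect mean absolute error of 0 is assumed for perf_score.

    # Illustrative skill-score workflow (assumed usage; keyword names follow
    # the summary above, and perf_score=0 assumes a perfect MAE of zero).
    import numpy as np
    import hydrostats.ens_metrics as em

    np.random.seed(42)

    # Synthetic data: 365 start dates, 51 ensemble members.
    obs = np.random.gamma(shape=2.0, scale=10.0, size=365)
    fcst_ens = obs[:, np.newaxis] + np.random.normal(scale=4.0, size=(365, 51))

    # Benchmark: a "climatology" ensemble built by resampling the observations
    # (an assumed stand-in for a real reference forecast).
    bench_ens = np.random.choice(obs, size=(365, 51))

    fcst_mae = em.ens_mae(obs=obs, fcst_ens=fcst_ens)
    bench_mae = em.ens_mae(obs=obs, fcst_ens=bench_ens)

    # Skill relative to the benchmark: 1 is perfect, 0 matches the benchmark,
    # and negative values are worse than the benchmark.
    print(em.skill_score(fcst_mae, bench_mae, perf_score=0))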