Mitter, H., Schönhart, M., & Schmid, E. (2014). Integrated climate change impact and adaptation assessment for the agricultural sector in Austria.
Mitter, H., Heumesser, C., & Schmid, E. (2014). Modelling robust crop production portfolios to assess agricultural vulnerability to climate change.
Mitter, H., Schönhart, M., Meyer, I., Mechtler, K., Schmid, E., Sinabell, F., et al. (2015). Agriculture. In K. Steiniger & M. König (Eds.), Cost of Inaction in Austria. Vienna: Springer.
Mitter, H., Schmid, E., & Sinabell, F. (2015). Climate change and policy impacts on Austrian protein crop supply balances (Vol. 2015).
Wallach, D., Thorburn, P., Asseng, S., Challinor, A. J., Ewert, F., Jones, J. W., et al. (2016). Estimating model prediction error: Should you treat predictions as fixed or random? Environmental Modelling & Software, 84, 529–539.
Abstract: Crop models are important tools for impact assessment of climate change, as well as for exploring management options under current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error: MSEPfixed, which evaluates the mean squared error of prediction for a model with fixed structure, parameters, and inputs, and MSEPuncertain(X), which evaluates the mean squared error averaged over the distributions of model structure, inputs, and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEPuncertain(X) can be estimated using a random effects ANOVA. It is argued that MSEPuncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.