The environmental costs of intensive farming are often under-estimated or not traded by the market, even though they play an important role in meeting future society's needs. Estimating nitrogen (N) dynamics is therefore an important issue that demands detailed simulation-based methods, used in an integrated way, to correctly represent the complex and non-linear interactions within cropping systems. To calculate the N2O flux and N leaching from European arable lands, a modeling framework has been developed by linking the CAPRI agro-economic dataset with the DNDC-EUROPE bio-geo-chemical model. However, despite the power of modern computers, running such a framework at continental scale is often too computationally costly. By comparing several statistical methods, this paper aims to design a metamodel able to approximate the expensive code of the detailed modeling approach, seeking the best compromise between estimation performance and simulation speed. We describe the use of two parametric (linear) models and six non-parametric approaches: two spline-based methods (ACOSSO and SDR), a kriging-based method (DACE), a neural-network method (multilayer perceptron, MLP), support vector machines (SVM), and a bagging method (random forest, RF). The analysis shows that when few data are available to train the metamodel, the spline approaches give the best results, while as the training dataset grows, SVM and RF provide faster and more accurate solutions.
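The metamodeling workflow described above — run the expensive simulator on a limited design of input points, then fit a cheap statistical surrogate that can be evaluated many times — can be sketched as follows. This is a minimal illustration using only the Python standard library: the `expensive_model` toy function and the inverse-distance-weighted interpolant are assumptions for demonstration, loosely standing in for a simulator like DNDC-EUROPE and for the kriging-style surrogates compared in the paper; they are not the actual models used.

```python
import math
import random

# Toy nonlinear response standing in for a costly simulator run
# (an assumption for illustration only, not DNDC-EUROPE).
def expensive_model(x1, x2):
    return math.sin(3.0 * x1) + 0.5 * x2 * x2

# 1. Run the expensive code on a small design of training points.
random.seed(42)
design = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
responses = [expensive_model(a, b) for a, b in design]

# 2. Cheap metamodel: inverse-distance-weighted average of the training
#    responses (a simplified distance-based interpolant, not DACE itself).
def metamodel(x1, x2, power=2.0):
    num = den = 0.0
    for (a, b), y in zip(design, responses):
        d2 = (x1 - a) ** 2 + (x2 - b) ** 2
        if d2 == 0.0:
            return y  # the interpolant is exact at training points
        w = 1.0 / d2 ** (power / 2.0)
        num += w * y
        den += w
    return num / den

# 3. The surrogate can now be queried many times at negligible cost.
approx = metamodel(0.2, -0.3)
truth = expensive_model(0.2, -0.3)
```

The design choice mirrored here is the core trade-off the paper studies: the surrogate's accuracy depends on how many expensive simulator runs can be afforded for the training design, while its prediction cost stays low regardless.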