Journal of Global Optimization
The main results reported in this paper are two theorems concerning the use of a new type of risk-averting error criterion for data fitting. The first states that the convexity region of the risk-averting error criterion expands monotonically as its risk-sensitivity index increases. Because the risk-averting error criterion converges to the mean squared error criterion as the risk-sensitivity index goes to zero, it can be used to convexify the mean squared error criterion and thereby avoid non-global local minima. The second main theorem shows that, as the risk-sensitivity index increases to infinity, the risk-averting error criterion approaches the minimax error criterion, which is widely used for robustifying system controllers and filters.
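To make the two limiting statements concrete, the following is a minimal sketch in which the risk-averting error criterion is assumed to take a normalized log-sum-exp form; the paper's exact definition and scaling may differ, and the symbols C_\lambda, \epsilon_k, f, and K are introduced here purely for illustration. Writing \epsilon_k(w) = y_k - f(x_k, w) for the k-th fitting residual over K data pairs, consider

\[
C_\lambda(w) = \frac{1}{\lambda} \ln\!\left( \frac{1}{K} \sum_{k=1}^{K} e^{\lambda\, \epsilon_k(w)^2} \right), \qquad \lambda > 0,
\]

where \lambda is the risk-sensitivity index. The first-order expansion e^{\lambda \epsilon_k^2} = 1 + \lambda \epsilon_k^2 + O(\lambda^2) yields

\[
\lim_{\lambda \to 0^+} C_\lambda(w) = \frac{1}{K} \sum_{k=1}^{K} \epsilon_k(w)^2,
\]

the mean squared error criterion, while the elementary log-sum-exp bounds

\[
\max_{k} \epsilon_k(w)^2 - \frac{\ln K}{\lambda} \;\le\; C_\lambda(w) \;\le\; \max_{k} \epsilon_k(w)^2
\]

yield

\[
\lim_{\lambda \to \infty} C_\lambda(w) = \max_{1 \le k \le K} \epsilon_k(w)^2,
\]

the minimax error criterion.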