Decomposition-based optimization strategies decouple a system design problem and introduce coupling variables as decision variables that manage communication among subproblems. When the coupling variables consist of a small, finite number of scalars, the computational cost of such approaches is comparable to that of the equivalent, yet usually unsuccessful, attempts to solve the coupled system directly. When the coupling variables are infinite-dimensional quantities, such as functional data, implementing decomposition-based optimization strategies can become computationally challenging. Discretization is typically applied, transforming the infinite-dimensional variables into finite-dimensional representations, i.e., vectors. A large number of discretization points is often necessary to represent the functional data with sufficient accuracy, and so the dimensionality of these vector-valued coupling variables (VVCVs) can become prohibitively large for decomposition-based design optimization. Therefore, it is desirable to approximate the VVCVs with a reduced-dimension representation that improves optimization efficiency while preserving sufficient accuracy. We investigate two VVCV representation techniques, radial-basis function artificial neural networks and proper orthogonal decomposition, and implement each in an analytical target cascading problem formulation for electric vehicle powertrain system optimization. Specifically, both techniques are applied to VVCVs associated with motor boundary torque curves and power loss maps and are assessed in terms of dimensionality reduction, computational expense, and accuracy.
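As a rough illustration of the second technique, the sketch below applies a truncated proper orthogonal decomposition (via the SVD) to a set of discretized curves, compressing each high-dimensional vector-valued coupling variable into a handful of modal coefficients. The synthetic "torque curve" family, the discretization sizes, and all variable names are illustrative assumptions, not the paper's data or implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): reduce the
# dimensionality of discretized functional coupling variables, e.g.
# sampled motor torque curves, with a truncated POD computed by the SVD.
rng = np.random.default_rng(0)
n_points = 200      # fine discretization of each curve (assumed)
n_snapshots = 50    # number of sampled designs (assumed)

speed = np.linspace(0.0, 1.0, n_points)
# Synthetic curve family spanned by two shape functions with random weights.
snapshots = np.stack([
    (1.0 + 0.3 * rng.standard_normal()) * np.exp(-3.0 * speed)
    + (0.2 * rng.standard_normal()) * np.sin(np.pi * speed)
    for _ in range(n_snapshots)
])  # shape: (n_snapshots, n_points)

mean = snapshots.mean(axis=0)
centered = snapshots - mean

# POD modes are the right singular vectors of the centered snapshot matrix.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

k = 2                            # reduced dimension of each coupling variable
modes = Vt[:k]                   # (k, n_points) orthonormal basis
coeffs = centered @ modes.T      # (n_snapshots, k) reduced coordinates
reconstructed = mean + coeffs @ modes

rel_err = np.linalg.norm(reconstructed - snapshots) / np.linalg.norm(snapshots)
print(f"reduced {n_points} -> {k} coefficients, relative error {rel_err:.2e}")
```

In a decomposition-based formulation, only the `k` coefficients (plus the shared modes) would be exchanged between subproblems instead of the full `n_points`-dimensional vector; here the synthetic curves lie exactly in a two-mode subspace, so the reconstruction error is near machine precision.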