In this paper, we present a novel data-driven learning method, sparse fuzzy inference systems (SparseFIS), for Takagi-Sugeno (T-S) fuzzy systems extended with rule weights. The method consists of three phases. The first phase clusters the input/output feature space with iterative vector quantization and projects the obtained clusters onto one-dimensional axes to form the fuzzy sets (centers and widths) in the antecedent parts of the rules; the predefined number of clusters (= rules) serves as an upper bound on a reasonable granularity. The second phase optimizes the rule weights with respect to a least-squares error measure by a sparsity-constrained steepest-descent procedure. Depending on the sparsity threshold, the weights of many or few rules are forced toward zero, thereby switching off (eliminating) those rules (rule selection). The third phase estimates the linear consequent parameters of each rule separately by a regularized, sparsity-constrained optimization procedure (local learning approach). Here, the sparsity constraints force linear parameters toward zero, triggering a per-rule feature-selection mechanism; global feature selection is achieved whenever the linear parameters of some features are (near) zero in every rule. The method is evaluated on high-dimensional data from industrial processes and on publicly available benchmark datasets, and compared with well-known batch-training methods in terms of accuracy and complexity of the resulting fuzzy systems.
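The sparsity-constrained weight optimization of the second phase can be illustrated with a minimal stand-in: a least-squares fit of the rule weights by steepest descent in which, after each step, all but the k largest-magnitude weights are zeroed (iterative hard thresholding). The function name, the fixed rule budget k, and the hard-thresholding projection are illustrative assumptions for this sketch, not the paper's exact procedure, which steers sparsity via a threshold parameter.

```python
import numpy as np

def sparse_rule_weights(Psi, y, k, lr=0.1, iters=500):
    """Illustrative sketch (not the paper's exact algorithm): fit rule
    weights rho so that y ~= Psi @ rho by steepest descent on the
    least-squares error, zeroing all but the k largest-magnitude
    weights after each step. Rules whose weight ends at zero are
    effectively switched off (rule selection)."""
    n, m = Psi.shape          # n samples, m rules (rule activation levels)
    rho = np.zeros(m)
    for _ in range(iters):
        grad = Psi.T @ (Psi @ rho - y) / n   # least-squares gradient
        rho = rho - lr * grad
        keep = np.argsort(np.abs(rho))[-k:]  # k largest-magnitude weights
        mask = np.zeros(m, dtype=bool)
        mask[keep] = True
        rho[~mask] = 0.0                     # sparsity projection
    return rho
```

With (near-)orthonormal rule activations the surviving weights converge to the least-squares solution restricted to the selected rules; in SparseFIS the sparsity threshold plays the role of k here, steering how many rules remain active.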