Self‐tuning of fuzzy rules when learning data have a radically changing distribution
Author(s) - Gonda Eikou, Miyata Hitoshi, Ohkita Masaaki
Publication year - 2003
Publication title - Electrical Engineering in Japan
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.136
H-Index - 28
eISSN - 1520-6416
pISSN - 0424-7760
DOI - 10.1002/eej.10168
Subject(s) - gradient descent , mathematics , fuzzy logic , artificial intelligence , simulated annealing , piecewise , method of steepest descent , algorithm , mathematical optimization , computer science , artificial neural network , mathematical analysis
In this paper, we propose a new type of membership function (MSF) and its efficient use to improve the optimization of fuzzy reasoning by a steepest descent method. For self‐tuning of fuzzy rules by the steepest descent method, an algorithm that avoids suboptimal solutions by modifying the learning coefficients has been proposed, in which piecewise linear MSFs were introduced. When the learning data have a radically changing distribution, however, such an algorithm cannot avoid suboptimal solutions. To overcome this problem, we propose applying double right triangular MSFs to the self‐tuning of fuzzy reasoning; these MSFs can easily represent radically changing grades. In addition, using a simulated annealing (SA) technique, we propose moving the peak positions of the MSFs as learning progresses, so that the MSFs are placed where the learning data change radically. Compared with the algorithm using piecewise linear MSFs, the new algorithm avoids suboptimal solutions more effectively. The advantages of this technique are demonstrated by numerical examples involving function approximation. © 2003 Wiley Periodicals, Inc. Electr Eng Jpn, 144(4): 63–74, 2003; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/eej.10168
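The following is a minimal Python sketch of the ideas in the abstract, not the paper's actual algorithm. It assumes one plausible reading of "double right triangular MSF" (two right triangles meeting at a peak, with independent left and right heights so the grade can change sharply there), zero-order fuzzy reasoning with singleton consequents, a step-like target as the "radically changing" learning data, and illustrative values for the learning rate, temperature, and cooling schedule. Only the consequents (steepest descent) and peak positions (SA) are tuned here; the paper also tunes the MSF shapes themselves.

```python
import math
import random

random.seed(0)

# Double right triangular MSF (assumed reading): two right triangles
# meeting at `peak` with independent heights, so the membership grade
# can jump sharply at the peak.
def msf(x, peak, span, h_left, h_right):
    if peak - span < x <= peak:
        return h_left * (x - (peak - span)) / span
    if peak < x < peak + span:
        return h_right * ((peak + span) - x) / span
    return 0.0

# Hypothetical setup: n rules on [0, 1] with singleton consequents w[i].
n, span = 5, 0.25
peaks = [i / (n - 1) for i in range(n)]
hl = [1.0] * n          # left-triangle heights (fixed in this sketch)
hr = [1.0] * n          # right-triangle heights (fixed in this sketch)
w = [0.0] * n

def grades(x):
    return [msf(x, peaks[i], span, hl[i], hr[i]) for i in range(n)]

def infer(x):
    # Zero-order (singleton-consequent) fuzzy reasoning.
    mus = grades(x)
    s = sum(mus)
    return sum(m * wi for m, wi in zip(mus, w)) / s if s else 0.0

def sse(data):
    return sum((infer(x) - t) ** 2 for x, t in data)

# Learning data with a radically changing (step-like) distribution.
target = lambda x: 0.0 if x < 0.5 else 1.0
data = [(i / 50.0, target(i / 50.0)) for i in range(51)]

eta, T = 0.5, 0.1
for epoch in range(300):
    # Steepest descent on the consequent singletons.
    for x, t in data:
        mus = grades(x)
        s = sum(mus)
        if not s:
            continue
        y = sum(m * wi for m, wi in zip(mus, w)) / s
        for i in range(n):
            w[i] -= eta * (y - t) * mus[i] / s   # dE/dw_i, E = (y-t)^2/2

    # Simulated annealing on a peak position: propose a shift, accept a
    # worse fit with probability exp(-dE / T), then cool down. Over time
    # the peaks drift toward the region where the data change sharply.
    i = random.randrange(1, n - 1)
    old, e_old = peaks[i], sse(data)
    peaks[i] = min(max(old + random.gauss(0.0, 0.05), 0.02), 0.98)
    e_new = sse(data)
    if e_new > e_old and random.random() >= math.exp((e_old - e_new) / T):
        peaks[i] = old                           # reject the move
    T *= 0.99                                    # cooling schedule

print("final SSE:", round(sse(data), 4))
```

Fixed, equally spaced peaks force a standard triangular partition to smear the step at x = 0.5 across a whole span; letting SA relocate a peak to the step, where the double-height MSF can represent the abrupt grade change, is what the abstract credits with escaping such suboptimal fits.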
