The development of robust and reliable systems requires techniques capable of handling everyday problems. In the implementation of systems that use approximate reasoning, the fuzzy technique has been applied with success. However, there are situations where this technique does not fully satisfy the characteristics of the problem. In search of better results, alternatives have been explored, such as applying interval theory to fuzzy systems in order to minimize the errors introduced when implementing the values supplied by the specialist. This work describes a way of addressing the problem of the specialist's error in the representation of approximate reasoning, proposing for that purpose an interval min-max fuzzy inference.
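The abstract does not give the details of the proposed inference, but the general idea of an interval min-max (Mamdani-style) fuzzy inference can be sketched as follows: membership degrees are represented as intervals [lo, hi] bounding the specialist's uncertainty, and the usual min (AND) and max (OR) operators are extended endpoint-wise. All names and the rule structure below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of interval-valued min-max fuzzy inference.
# A membership degree is an interval (lo, hi) with 0 <= lo <= hi <= 1,
# bounding the uncertainty in the value supplied by the specialist.

def interval_min(a, b):
    # Interval extension of min, applied endpoint-wise.
    return (min(a[0], b[0]), min(a[1], b[1]))

def interval_max(a, b):
    # Interval extension of max, applied endpoint-wise.
    return (max(a[0], b[0]), max(a[1], b[1]))

def infer(rules):
    # Each rule is a list of interval antecedent degrees.
    # Firing strength of a rule = interval min over its antecedents (AND);
    # aggregated output = interval max over the rule strengths (OR).
    strengths = []
    for antecedents in rules:
        s = antecedents[0]
        for a in antecedents[1:]:
            s = interval_min(s, a)
        strengths.append(s)
    out = strengths[0]
    for s in strengths[1:]:
        out = interval_max(out, s)
    return out

# Example with two rules, each with two interval antecedent degrees.
rules = [
    [(0.6, 0.8), (0.3, 0.5)],  # rule 1: interval min gives (0.3, 0.5)
    [(0.2, 0.4), (0.7, 0.9)],  # rule 2: interval min gives (0.2, 0.4)
]
print(infer(rules))  # -> (0.3, 0.5)
```

The endpoint-wise extension is the standard way to lift monotone operators such as min and max to intervals; the resulting interval encloses every value the crisp inference could produce for degrees inside the given bounds.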