Local explanations of information aggregation with fuzzy linguistic rules
Explainable Artificial Intelligence (XAI) is a relatively recent field that aims to enable intelligent systems to provide sound justifications for their decisions and behaviors. Since XAI is human-centered, it has strong connections with fuzzy systems. In this paper we consider the local explanation of the output of a set of fuzzy linguistic rules in terms of the contributions of the inputs to the output. The contributions are defined from the average of the gradients along the straight line linking the start point to the end point. This approach ensures that the variation of the output is equal to the sum of the contributions of the input variables.
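
The following is a minimal numerical sketch, not the authors' code, of the contribution definition described above: for a differentiable model f, the contribution of each input is the average of its partial derivative along the straight line from a start point to an end point, scaled by that input's variation, so that the contributions sum to the output variation f(end) - f(start). The example model, the gradient function, and the number of sampling steps are illustrative assumptions standing in for an inferred fuzzy rule output.

import numpy as np

def contributions(f, grad_f, x_start, x_end, n_steps=200):
    """Average-gradient contributions along the segment x_start -> x_end."""
    alphas = (np.arange(n_steps) + 0.5) / n_steps        # midpoint sampling of the line
    grads = np.array([grad_f(x_start + a * (x_end - x_start)) for a in alphas])
    avg_grad = grads.mean(axis=0)                         # average gradient per input
    return (x_end - x_start) * avg_grad                   # one contribution per input variable

# Illustrative smooth function standing in for the aggregated rule output.
f = lambda x: x[0] * x[1] + np.sin(x[2])
grad_f = lambda x: np.array([x[1], x[0], np.cos(x[2])])

x_start = np.array([0.2, 0.5, 1.0])
x_end = np.array([0.8, 0.1, 2.0])
c = contributions(f, grad_f, x_start, x_end)
print(c)                                                  # per-input contributions
print(c.sum(), f(x_end) - f(x_start))                     # sum of contributions ~ output variation

With a sufficient number of sampling steps the printed sum matches the output variation up to numerical error, which is the completeness property stated in the abstract.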