Notes

Prior work has repeatedly pointed out that a prerequisite for explanations to be truly human-like is that they be interactive, because explanation is a grounding process in which people incrementally close belief gaps.

Interactive explanations: One Explanation Does Not Fit All: A Toolkit and Taxonomy of AI Explainability Techniques

Interactive explanation allows users to drill down or ask for different types of explanation until they are satisfied with their understanding.
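The drill-down idea can be sketched as a simple dialogue loop. This is a minimal illustration with hypothetical explainer names and canned responses, not any paper's actual toolkit:

```python
# Minimal sketch of an interactive explanation loop: the user keeps
# requesting different explanation types until satisfied. The explainer
# names and responses below are hypothetical placeholders.

EXPLAINERS = {
    "feature-importance": lambda pred: f"Top features driving '{pred}': income, credit history",
    "counterfactual":     lambda pred: f"'{pred}' would flip if income were higher",
    "example-based":      lambda pred: f"Similar past cases also received '{pred}'",
}

def explain_interactively(prediction, requests):
    """Answer a sequence of user requests, one explanation type at a time."""
    dialogue = []
    for kind in requests:                      # each request picks an explanation type
        explainer = EXPLAINERS.get(kind)
        if explainer is None:
            dialogue.append(f"No explainer available for '{kind}'")
        else:
            dialogue.append(explainer(prediction))
    return dialogue

# Example session: the user tries two explanation types before stopping.
session = explain_interactively("loan denied", ["feature-importance", "counterfactual"])
```

The key design point is that the user, not the system, decides when understanding is sufficient and the loop ends.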

Bridging the Gap Between Ethics and Practice: Guidelines for Reliable, Safe, and Trustworthy Human-centered AI Systems

Exploration works best when the user inputs are actionable, that is, when users have control and can change the inputs. Alternatives are needed when users do not have control over the input values or when …

These diverse concerns mean that drawing researchers and practitioners from diverse disciplines is more likely to lead to success.

Support for technical discussions with data scientists and stakeholders matters, since mitigating this friction is critical to the success of their advocacy for explainability. One opportunity for sensitizing support is to create a concrete mapping between user questions and algorithmic capabilities.
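Such a question-to-capability mapping could look like the following sketch. The question phrasings and technique groupings are illustrative assumptions on my part, not a canonical taxonomy:

```python
# Illustrative (not exhaustive) mapping from user question types to
# classes of XAI techniques. The keys and groupings are assumptions
# chosen for illustration, not an authoritative taxonomy.
QUESTION_TO_TECHNIQUE = {
    "Why this prediction?":       ["feature attribution (e.g. SHAP, LIME)"],
    "Why not another outcome?":   ["contrastive / counterfactual explanations"],
    "What if the input changed?": ["what-if analysis, partial dependence"],
    "How does the model work?":   ["global surrogates, rule extraction"],
    "When does the model fail?":  ["performance disaggregation, error analysis"],
}

def techniques_for(question: str):
    """Look up candidate technique classes for a user question."""
    return QUESTION_TO_TECHNIQUE.get(question, ["(no mapping yet)"])
```

A table like this can anchor discussions between designers and data scientists: the designer brings the questions users actually ask, the data scientist says which entries are feasible for the model at hand.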

It is not feasible for human decision making either, since much intuition and tacit knowledge cannot be externalized and explained.

No way of extracting, structuring, and presenting information can fully externalize and explain human intuition and tacit knowledge.

Therefore, the proper stance is that explanations are only expected to be simplified approximations of complex decision-making functions; it is enough if the users' need is fulfilled.