Don't Just Tell Me, Ask Me

Critical thinking is an essential human skill, yet research shows that our reasoning suffers from personal biases and limited cognitive resources, which can lead to dangerous outcomes. This project presents the novel idea of AI-supported self-explanations: instead of telling people an answer, the AI reframes information as questions that actively engage their thinking and scaffold their reasoning process. We conducted a study with 210 participants that compared AI-supported self-explanations against an AI that provides recommendations with typical causal explanations and against a no-feedback condition, measuring users' ability to discern the logical validity of statements. Our results show that AI-supported self-explanations significantly improve accuracy in identifying logically flawed statements over both other conditions and increase users' desire to verify information with additional sources. The experiment exemplifies a future style of human-AI co-reasoning system in which the AI becomes a critical thinking stimulator rather than an information teller.
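To make the contrast between the two feedback styles concrete, here is a minimal sketch, not the authors' implementation: the function names and the exact wording of the scaffolding questions are illustrative assumptions, showing only how question-style feedback differs from a verdict plus a causal explanation.

```python
# Illustrative sketch (assumed, not the study's actual system) of the two
# feedback styles compared in the experiment.

def self_explanation_feedback(statement: str) -> str:
    """Reframe a statement as questions that scaffold the user's own reasoning."""
    return (
        f'Consider the statement: "{statement}"\n'
        "- What is the claim, and what premises support it?\n"
        "- Does the conclusion actually follow from those premises?\n"
        "- What additional information would you want before deciding?"
    )

def recommendation_feedback(statement: str, verdict: str, explanation: str) -> str:
    """Conventional tell-style feedback: a verdict plus a causal explanation."""
    return f'The statement "{statement}" appears {verdict} because {explanation}.'

if __name__ == "__main__":
    s = "All birds can fly; penguins are birds; therefore penguins can fly."
    print(self_explanation_feedback(s))
    print()
    print(recommendation_feedback(s, "invalid", "the first premise is false"))
```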

The interface for showing feedback to study participants.

We found that AI-supported self-explanations increased people's accuracy in identifying logically invalid statements and strengthened their desire to gather more information before making up their minds.