Open-ended QA isn't just about finding one answer: users need follow-up insights to refine their thinking. This work shows how to systematically generate such related insights from document collections to support iterative question answering.
This paper introduces a new task in which AI systems generate additional insights from documents to help users refine their answers to open-ended questions. The authors release SCOpE-QA, a dataset of 3,000 questions, and propose InsightGen, a method that clusters documents thematically and selects relevant context so that language models can generate diverse insights.
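The pipeline described above can be sketched at a high level: group documents by theme, pick the most question-relevant document from each theme, and build one insight-generation prompt per theme. This is a minimal illustrative sketch, not the paper's actual InsightGen implementation; the bag-of-words similarity, greedy clustering, the `threshold` value, and all function names here are assumptions standing in for the real components (dense embeddings, a proper clustering algorithm, and an LM call).

```python
from collections import Counter
import math

def embed(text):
    # Assumption: bag-of-words term counts stand in for dense embeddings.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster_docs(docs, threshold=0.2):
    # Greedy thematic clustering (illustrative): a document joins the
    # first cluster whose seed document it resembles, else starts a new one.
    clusters = []
    for doc in docs:
        vec = embed(doc)
        for cluster in clusters:
            if cosine(vec, embed(cluster[0])) >= threshold:
                cluster.append(doc)
                break
        else:
            clusters.append([doc])
    return clusters

def select_context(question, cluster):
    # Context selection: keep the cluster member most similar to the question.
    qv = embed(question)
    return max(cluster, key=lambda d: cosine(qv, embed(d)))

def insight_prompts(question, docs):
    # One prompt per theme; a language model would turn each into an insight.
    return [
        f"Question: {question}\n"
        f"Context: {select_context(question, cluster)}\n"
        f"Suggest one additional insight."
        for cluster in cluster_docs(docs)
    ]

docs = [
    "solar panels cut household energy bills",
    "rooftop solar panels reduce bills for homes",
    "wind turbines face local zoning opposition",
]
prompts = insight_prompts("How can homes lower energy costs?", docs)
print(len(prompts))  # → 2 (one prompt per thematic cluster)
```

The two solar documents land in one cluster and the wind document in another, so the question yields two themed prompts; diversity comes from prompting once per theme rather than once over the whole collection.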