Replacing KL-divergence-based information gain with integral probability metrics makes Bayesian experimental design more robust to model errors and better at handling rare events—critical for resource-constrained settings where data collection is expensive.
This paper rethinks how experiments are chosen when each observation is costly. Instead of traditional information-gain criteria, which struggle with rare events and misspecified models, the authors score candidate experiments using integral probability metrics such as the Wasserstein distance.
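To make the idea concrete, here is a minimal sketch (not the paper's actual method) of a Wasserstein-based design criterion on a toy conjugate-Gaussian model: each candidate design is scored by the expected Wasserstein-1 distance between the prior and the posterior, averaged over data simulated from the prior predictive. The model, the scalar "design gain" `d`, and the helper names are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

# Toy model (assumption, not from the paper):
#   prior:       theta ~ Normal(0, 1)
#   observation: y | theta, d ~ Normal(d * theta, 1)
# A design is a scalar gain d; larger |d| yields more informative data.

def posterior_samples(d, y, n=2000):
    # Conjugate Gaussian update: posterior precision = 1 + d^2,
    # posterior mean = d * y / (1 + d^2).
    prec = 1.0 + d * d
    return rng.normal(d * y / prec, np.sqrt(1.0 / prec), size=n)

def ipm_utility(d, n_outer=100, n_inner=2000):
    # Expected Wasserstein-1 distance between prior and posterior,
    # averaged over data drawn from the prior predictive. This plays
    # the role that expected KL information gain plays classically.
    prior = rng.normal(0.0, 1.0, size=n_inner)
    total = 0.0
    for _ in range(n_outer):
        theta = rng.normal()              # draw a "true" parameter
        y = rng.normal(d * theta, 1.0)    # simulate one observation
        total += wasserstein_distance(prior, posterior_samples(d, y, n_inner))
    return total / n_outer

# Score a few candidate designs and pick the best.
designs = [0.1, 0.5, 1.0, 2.0]
scores = {d: ipm_utility(d) for d in designs}
best = max(scores, key=scores.get)
```

Because Wasserstein-1 compares full sample distributions rather than density ratios, this criterion remains well defined even when prior and posterior have little overlap, which is one intuition behind preferring IPMs in the rare-event settings the paper targets.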