Menglin Jia
2021
When in Doubt: Improving Classification Performance with Alternating Normalization
Menglin Jia | Austin Reiter | Ser-Nam Lim | Yoav Artzi | Claire Cardie
Findings of the Association for Computational Linguistics: EMNLP 2021
We introduce Classification with Alternating Normalization (CAN), a non-parametric post-processing step for classification. CAN improves classification accuracy on challenging examples by re-adjusting their predicted class probability distributions using the predicted class distributions of high-confidence validation examples. CAN is easily applicable to any probabilistic classifier, with minimal computational overhead. We analyze the properties of CAN using simulated experiments and empirically demonstrate its effectiveness across a diverse set of classification tasks.
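The core idea of the abstract — re-adjusting a low-confidence prediction by alternately normalizing it together with high-confidence validation predictions — can be sketched as below. This is a minimal illustrative sketch, not the paper's reference implementation; the function name `can_adjust` and the hyperparameters `alpha` and `iters` are assumptions for illustration.

```python
import numpy as np

def can_adjust(p_low, P_high, priors, alpha=1.0, iters=3):
    """Sketch of CAN-style alternating normalization.

    p_low:  (K,) predicted distribution for one low-confidence example
    P_high: (N, K) predicted distributions of high-confidence validation examples
    priors: (K,) class prior distribution to match column-wise
    alpha, iters: illustrative hyperparameters, not the paper's exact settings
    """
    # Stack the uncertain example beneath the confident ones.
    L = np.vstack([P_high, p_low[None, :]]) ** alpha
    for _ in range(iters):
        # Column step: rescale each class column toward the class prior.
        L = L / L.sum(axis=0, keepdims=True) * priors[None, :]
        # Row step: renormalize each row back into a probability distribution.
        L = L / L.sum(axis=1, keepdims=True)
    # The last row is the re-adjusted prediction for the uncertain example.
    return L[-1]
```

Because the procedure only renormalizes an existing probability matrix, it adds no trainable parameters and can be applied after any probabilistic classifier, consistent with the abstract's claim of minimal overhead.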