Kanda Runapongsa Saikaew


2020

A Neural Local Coherence Analysis Model for Clarity Text Scoring
Panitan Muangkammuen | Sheng Xu | Fumiyo Fukumoto | Kanda Runapongsa Saikaew | Jiyi Li
Proceedings of the 28th International Conference on Computational Linguistics

The local coherence relation between two phrases/sentences, such as cause-effect and contrast, strongly influences whether a text is well-structured. This paper follows this assumption and presents a method for scoring text clarity by utilizing local coherence between adjacent sentences. We hypothesize that contextual features of coherence relations, learned from data that differ from the target training data, can also discriminate whether the target text is well-structured and thus help score its clarity. We propose a text clarity scoring method that utilizes local coherence analysis in an out-of-domain setting, i.e., the training data for the source and target tasks differ from each other. The method, built on the pre-trained language model BERT, first trains the local coherence model as an auxiliary task and then re-trains it jointly with the text clarity scoring model. Experimental results on the PeerRead benchmark dataset show an improvement over a single text clarity scoring model. Our source code is available online.
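
The abstract describes a two-stage setup: an auxiliary local coherence model trained on out-of-domain sentence pairs, followed by joint re-training with the clarity scoring model on top of BERT. The following is a minimal sketch of that idea, assuming a shared BERT encoder with two heads; all class names, variable names, and the loss combination are illustrative assumptions, not taken from the authors' released code.

import torch
import torch.nn as nn
from transformers import BertModel

class CoherenceClarityModel(nn.Module):
    """Hypothetical shared-encoder model: one BERT encoder, two heads."""

    def __init__(self, bert_name="bert-base-uncased", num_coherence_labels=2):
        super().__init__()
        self.encoder = BertModel.from_pretrained(bert_name)
        hidden = self.encoder.config.hidden_size
        # Auxiliary head: predicts a local-coherence label for a sentence pair.
        self.coherence_head = nn.Linear(hidden, num_coherence_labels)
        # Main head: predicts a scalar clarity score for a text.
        self.clarity_head = nn.Linear(hidden, 1)

    def encode(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return out.pooler_output  # [CLS]-based pooled representation

    def coherence_logits(self, input_ids, attention_mask):
        return self.coherence_head(self.encode(input_ids, attention_mask))

    def clarity_score(self, input_ids, attention_mask):
        return self.clarity_head(self.encode(input_ids, attention_mask)).squeeze(-1)

# Stage 1 (auxiliary, out-of-domain data): train the coherence head with a
# cross-entropy loss over coherence relations between adjacent sentences.
# Stage 2 (joint, target data): re-train the encoder with both heads, e.g.
#   loss = mse(clarity_pred, clarity_gold) + lambda_aux * ce(coherence_logits, coherence_gold)
# where lambda_aux is an assumed weighting hyperparameter.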