@inproceedings{hosseini-caragea-2021-distilling-knowledge,
    title = "Distilling Knowledge for Empathy Detection",
    author = "Hosseini, Mahshid  and
      Caragea, Cornelia",
    editor = "Moens, Marie-Francine  and
      Huang, Xuanjing  and
      Specia, Lucia  and
      Yih, Scott Wen-tau",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2021.findings-emnlp.314/",
    doi = "10.18653/v1/2021.findings-emnlp.314",
    pages = "3713--3724",
    abstract = "Empathy is the link between self and others. Detecting and understanding empathy is a key element for improving human-machine interaction. However, annotating data for detecting empathy at a large scale is a challenging task. This paper employs multi-task training with knowledge distillation to incorporate knowledge from available resources (emotion and sentiment) to detect empathy from the natural language in different domains. This approach yields better results on an existing news-related empathy dataset compared to strong baselines. In addition, we build a new dataset for empathy prediction with fine-grained empathy direction, seeking or providing empathy, from Twitter. We release our dataset for research purposes."
}

Markdown (Informal)
[Distilling Knowledge for Empathy Detection](https://preview.aclanthology.org/ingest-emnlp/2021.findings-emnlp.314/) (Hosseini & Caragea, Findings 2021)
ACL
Mahshid Hosseini and Cornelia Caragea. 2021. Distilling Knowledge for Empathy Detection. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3713–3724, Punta Cana, Dominican Republic. Association for Computational Linguistics.
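The abstract describes multi-task training with knowledge distillation, transferring knowledge from emotion and sentiment resources into an empathy classifier. The sketch below is a minimal, hypothetical PyTorch illustration of that general recipe, not the authors' implementation; the model sizes, task heads, temperature, and loss weighting are all illustrative assumptions.

```python
# Hypothetical sketch of multi-task knowledge distillation (not the paper's code).
# A student with a shared encoder is trained on gold empathy labels while its
# emotion and sentiment heads match softened predictions from frozen teachers.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Classifier(nn.Module):
    """Toy encoder + linear head standing in for a BERT-style model."""
    def __init__(self, input_dim: int, num_labels: int, hidden: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, num_labels)

    def forward(self, x):
        return self.head(self.encoder(x))

def distillation_loss(student_logits, teacher_logits, T: float = 2.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

# Dummy dimensions and data; real inputs would be contextual text embeddings.
input_dim, num_empathy, num_emotion, num_sentiment = 32, 2, 6, 3
x = torch.randn(16, input_dim)
y_empathy = torch.randint(0, num_empathy, (16,))

# Frozen teachers, assumed pre-trained on emotion and sentiment corpora.
emotion_teacher = Classifier(input_dim, num_emotion).eval()
sentiment_teacher = Classifier(input_dim, num_sentiment).eval()

# Student: shared encoder with one head per task (multi-task setup).
shared = nn.Sequential(nn.Linear(input_dim, 64), nn.ReLU())
empathy_head = nn.Linear(64, num_empathy)
emotion_head = nn.Linear(64, num_emotion)
sentiment_head = nn.Linear(64, num_sentiment)
params = (list(shared.parameters()) + list(empathy_head.parameters())
          + list(emotion_head.parameters()) + list(sentiment_head.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-3)

alpha = 0.5  # balance between supervised empathy loss and distillation terms
for step in range(3):
    h = shared(x)
    hard_loss = F.cross_entropy(empathy_head(h), y_empathy)
    with torch.no_grad():
        emo_targets = emotion_teacher(x)
        sent_targets = sentiment_teacher(x)
    soft_loss = (distillation_loss(emotion_head(h), emo_targets)
                 + distillation_loss(sentiment_head(h), sent_targets))
    loss = alpha * hard_loss + (1 - alpha) * soft_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss={loss.item():.4f}")
```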