Iadh Ounis


2021

RelDiff: Enriching Knowledge Graph Relation Representations for Sensitivity Classification
Hitarth Narvala | Graham McDonald | Iadh Ounis
Findings of the Association for Computational Linguistics: EMNLP 2021

The relationships that exist between entities can be a reliable indicator for classifying sensitive information, such as commercially sensitive information. For example, the relation person-IsDirectorOf-company can indicate whether an individual’s salary should be considered sensitive personal information. Representations of such relations are often learned using a knowledge graph to produce embeddings for relation types, generalised across different entity-pairs. However, a relation type may or may not correspond to a sensitivity depending on the entities that participate in the relation. Therefore, generalised relation embeddings are typically insufficient for classifying sensitive information. In this work, we propose a novel method for representing entities and relations within a single embedding to better capture the relationship between the entities. Moreover, we show that our proposed entity-relation-entity embedding approach can significantly improve (McNemar’s test, p < 0.05) the effectiveness of sensitivity classification, compared to classification approaches that leverage relation embedding approaches from the literature (0.426 F1 vs. 0.413 F1).
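The abstract does not spell out how the single entity-relation-entity embedding is constructed, so the following is only a minimal sketch, not the authors’ RelDiff implementation. It assumes pretrained entity and relation embeddings (e.g. from a TransE-style knowledge-graph model) and combines the head entity, relation, and tail entity vectors into one pair-specific triple embedding that is then fed to an off-the-shelf classifier. All names, dimensions, and the combination scheme are illustrative assumptions.

```python
# Minimal sketch: a pair-specific entity-relation-entity embedding for
# sensitivity classification. Embedding tables and labels are toy data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
dim = 50

# Hypothetical pretrained embeddings (e.g. learned with a TransE-style model).
entity_emb = {
    "person:J_Smith": rng.normal(size=dim),
    "company:AcmeCorp": rng.normal(size=dim),
}
relation_emb = {"IsDirectorOf": rng.normal(size=dim)}


def triple_embedding(head, relation, tail):
    """Combine head, relation, and tail into one vector, so the representation
    is specific to this entity pair rather than generalised over all pairs
    that share the relation type."""
    h, r, t = entity_emb[head], relation_emb[relation], entity_emb[tail]
    # The (t - h) term keeps a pair-specific translational signal.
    return np.concatenate([h, r, t, t - h])


# Toy training data: triple embeddings labelled sensitive (1) / not sensitive (0).
X = np.stack([
    triple_embedding("person:J_Smith", "IsDirectorOf", "company:AcmeCorp"),
    rng.normal(size=4 * dim),
])
y = np.array([1, 0])

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(X))
```

In practice the classifier and the choice of combination (concatenation, difference, or a learned projection) would be tuned on the sensitivity-labelled collection; the point of the sketch is only that the triple-level vector, unlike a relation-type embedding alone, changes with the participating entities.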

2020

Multi-Task Learning using Dynamic Task Weighting for Conversational Question Answering
Sarawoot Kongyoung | Craig Macdonald | Iadh Ounis
Proceedings of the 5th International Workshop on Search-Oriented Conversational AI (SCAI)

Conversational Question Answering (ConvQA) is a Conversational Search task in a simplified setting, where an answer must be extracted from a given passage. Neural language models, such as BERT, fine-tuned on large-scale ConvQA datasets such as CoQA and QuAC, have been used to address this task. Recently, Multi-Task Learning (MTL) has emerged as a particularly interesting approach for developing ConvQA models, where the objective is to enhance the performance of a primary task by sharing the learned structure across several related auxiliary tasks. However, existing ConvQA models that leverage MTL have not investigated the dynamic adjustment of the relative importance of the different tasks during learning, nor the resulting impact on the performance of the learned models. In this paper, we first study the effectiveness and efficiency of dynamic MTL methods, including Evolving Weighting, Uncertainty Weighting, and Loss-Balanced Task Weighting, compared to static MTL methods such as the uniform weighting of tasks. Furthermore, we propose a novel hybrid dynamic method that combines an Abridged Linear schedule for the main task with Loss-Balanced Task Weighting (LBTW) for the auxiliary tasks, so as to automatically fine-tune the task weighting during learning, ensuring that each task’s weight is adjusted according to the relative importance of the different tasks. We conduct experiments using QuAC, a large-scale ConvQA dataset. Our results demonstrate the effectiveness of our proposed method, which significantly outperforms both single-task learning and static task weighting methods, with improvements ranging from +2.72% to +3.20% in F1 score. Finally, our findings show that the performance of MTL in developing ConvQA models is sensitive to the correct selection of the auxiliary tasks, as well as to an adequate balancing of the losses of these tasks during training using LBTW.
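Since the abstract only names the weighting schemes, the snippet below is a minimal sketch, not the paper’s exact formulation. It assumes the standard Loss-Balanced Task Weighting rule w_t = (L_t / L_t^(0))^alpha for the auxiliary tasks, and a simple linear warm-up capped at 1 for the main task as one reading of an Abridged Linear schedule. The task names, alpha, and warm-up length are illustrative assumptions.

```python
# Minimal sketch of hybrid dynamic task weighting for multi-task learning:
# LBTW for auxiliary tasks, a capped linear warm-up for the main task.

class HybridTaskWeighter:
    def __init__(self, main_task, alpha=0.5, warmup_steps=1000):
        self.main_task = main_task
        self.alpha = alpha              # LBTW exponent: 0 = uniform, 1 = full rebalancing
        self.warmup_steps = warmup_steps
        self.initial_losses = {}        # first observed loss L_t^(0) per task

    def weights(self, losses, step):
        """losses: dict mapping task name -> current loss value."""
        w = {}
        for task, loss in losses.items():
            # Remember each task's first loss, needed by the LBTW ratio.
            self.initial_losses.setdefault(task, loss)
            if task == self.main_task:
                # Abridged linear warm-up: ramp the main-task weight to 1, then hold it.
                w[task] = min(1.0, (step + 1) / self.warmup_steps)
            else:
                # LBTW: down-weight an auxiliary task as its loss shrinks
                # relative to its initial value.
                w[task] = (loss / self.initial_losses[task]) ** self.alpha
        return w

    def combined_loss(self, losses, step):
        w = self.weights(losses, step)
        return sum(w[t] * losses[t] for t in losses)


# Toy usage with scalar losses; task names are hypothetical.
weighter = HybridTaskWeighter(main_task="answer_span")
losses = {"answer_span": 2.1, "answer_type": 1.4, "dialog_act": 0.9}
print(weighter.combined_loss(losses, step=0))
```

In a real training loop the per-task values would be batch-level losses from the shared BERT encoder’s task heads, and the combined loss would be back-propagated at every step so the weights adapt as the tasks converge at different rates.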

2014

Real-Time Detection, Tracking, and Monitoring of Automatically Discovered Events in Social Media
Miles Osborne | Sean Moran | Richard McCreadie | Alexander Von Lunen | Martin Sykora | Elizabeth Cano | Neil Ireson | Craig Macdonald | Iadh Ounis | Yulan He | Tom Jackson | Fabio Ciravegna | Ann O’Brien
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics: System Demonstrations

2006

Examining the Content Load of Part of Speech Blocks for Information Retrieval
Christina Lioma | Iadh Ounis
Proceedings of the COLING/ACL 2006 Main Conference Poster Sessions

2005

Deploying Part-of-Speech Patterns to Enhance Statistical Phrase-Based Machine Translation Resources
Christina Lioma | Iadh Ounis
Proceedings of the ACL Workshop on Building and Using Parallel Texts