Getenesh Teshome
2020
Similarity and Farness Based Bidirectional Neural Co-Attention for Amharic Natural Language Inference
Abebawu Eshetu
|
Getenesh Teshome
|
Ribka Alemayehu
Proceedings of the Fourth Widening Natural Language Processing Workshop
In natural language, a single idea can be conveyed by many different sentences, so downstream Natural Language Processing applications struggle to capture the meaning of ideas stated in varying expressions. To address this difficulty, researchers have conducted Natural Language Inference (NLI) studies for various languages, using methods ranging from traditional discrete models with hard logic to end-to-end neural networks. For Amharic, although there have been a number of research efforts on downstream NLP applications, they remain limited in understanding ideas expressed in different ways because no NLI resources exist for the language. Accordingly, we propose a deep-learning-based Natural Language Inference model using similarity- and farness-aware bidirectional attentive matching for Amharic texts. Experiments on the limited Amharic NLI dataset we prepared show promising results that can serve as a baseline for subsequent work.
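The abstract names similarity- and farness-aware bidirectional attentive matching but does not give the exact formulation. The sketch below is a minimal, hypothetical illustration of the general idea: it assumes cosine similarity as the "similarity" signal and Euclidean distance as the "farness" signal, combines them into one matching matrix, and attends in both directions (premise-to-hypothesis and hypothesis-to-premise). The function and variable names are illustrative, not from the paper.

```python
import numpy as np

def cosine_sim(A, B):
    # Pairwise cosine similarity between rows of A (n, d) and B (m, d).
    An = A / np.linalg.norm(A, axis=1, keepdims=True)
    Bn = B / np.linalg.norm(B, axis=1, keepdims=True)
    return An @ Bn.T

def euclidean_dist(A, B):
    # Pairwise Euclidean distance ("farness") between rows of A and B.
    return np.sqrt(((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1))

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_co_attention(P, H):
    # Combine similarity and (negated) farness into one matching score matrix.
    S = cosine_sim(P, H) - euclidean_dist(P, H)
    # Premise-to-hypothesis: each premise token attends over hypothesis tokens.
    p2h = softmax(S, axis=1) @ H        # shape (n_premise, d)
    # Hypothesis-to-premise: each hypothesis token attends over premise tokens.
    h2p = softmax(S, axis=0).T @ P      # shape (n_hypothesis, d)
    return p2h, h2p

rng = np.random.default_rng(0)
P = rng.normal(size=(5, 16))   # premise token embeddings (toy data)
H = rng.normal(size=(7, 16))   # hypothesis token embeddings (toy data)
p2h, h2p = bidirectional_co_attention(P, H)
print(p2h.shape, h2p.shape)    # (5, 16) (7, 16)
```

In a full model, the attended representations would be concatenated with the original embeddings and fed to downstream layers for the entailment decision; that part is omitted here.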
Bi-directional Answer-to-Answer Co-attention for Short Answer Grading using Deep Learning
Abebawu Eshetu
|
Getenesh Teshome
|
Ribka Alemahu
Proceedings of the Fourth Widening Natural Language Processing Workshop
So far, various research works have been conducted on scoring short answer questions. Building on advances in artificial intelligence and the adaptability of deep learning models, we introduce a new model to score short-answer subjective questions. Using bidirectional answer-to-answer co-attention, we demonstrate the extent to which the model detects word- and sentence-level features of student answers, and we show promising results on both the Kaggle and Mohler datasets. Experiments on the Amharic short-answer dataset prepared for this research also show promising results that can serve as a baseline for subsequent work.