SubmissionNumber#=%=#64
FinalPaperTitle#=%=#Team Unibuc - NLP at SemEval-2024 Task 8: Transformer and Hybrid Deep Learning Based Models for Machine-Generated Text Detection
ShortPaperTitle#=%=#
NumberOfPages#=%=#9
CopyrightSigned#=%=#Teodor-George Marchitan
JobTitle#==#
Organization#==#
Abstract#==#This paper describes the approach of the UniBuc - NLP team to SemEval 2024 Task 8: Multigenerator, Multidomain, and Multilingual Black-Box Machine-Generated Text Detection. We explored transformer-based and hybrid deep learning architectures. For subtask B, our transformer-based model achieved a strong second place out of 77 teams with an accuracy of 86.95%, demonstrating the architecture's suitability for this task. However, our models showed overfitting on subtask A, which could potentially be mitigated by less fine-tuning and a larger maximum sequence length. For subtask C (token-level classification), our hybrid model overfit during training, hindering its ability to detect transitions between human-written and machine-generated text.
Author{1}{Firstname}#=%=#Teodor-George
Author{1}{Lastname}#=%=#Marchitan
Author{1}{Username}#=%=#tmarchitan
Author{1}{Email}#=%=#teodor.marchitan@s.unibuc.ro
Author{1}{Affiliation}#=%=#University of Bucharest
Author{2}{Firstname}#=%=#Claudiu
Author{2}{Lastname}#=%=#Creanga
Author{2}{Username}#=%=#claudiucreanga
Author{2}{Email}#=%=#claudiu.creanga.backup@gmail.com
Author{2}{Affiliation}#=%=#University of Bucharest
Author{3}{Firstname}#=%=#Liviu P.
Author{3}{Lastname}#=%=#Dinu
Author{3}{Username}#=%=#liviu.p.dinu
Author{3}{Email}#=%=#liviu.p.dinu@gmail.com
Author{3}{Affiliation}#=%=#University of Bucharest
==========