Md. Shohanur Islam Sobuj




2025

LLM-Mixer: Multiscale Mixing in LLMs for Time Series Forecasting
Md Kowsher | Md. Shohanur Islam Sobuj | Nusrat Jahan Prottasha | E. Alejandro Alanis | Ozlem Garibay | Niloofar Yousefi
Proceedings of the 4th Table Representation Learning Workshop

Time series forecasting is a challenging task, especially when dealing with data that contains both short-term variations and long-term trends. In this study, we introduce LLM-Mixer, a novel framework that combines multiscale time-series decomposition with the power of pre-trained Large Language Models (LLMs). LLM-Mixer breaks down time-series data into multiple temporal resolutions using downsampling and processes these multiscale representations with a frozen LLM, guided by a carefully designed text prompt that encodes information about the dataset’s features and structure. To understand the role of downsampling, we conduct a detailed analysis using Neural Tangent Kernel (NTK) distance, showing that incorporating multiple scales improves the model’s learning dynamics. We evaluate LLM-Mixer across a diverse set of forecasting tasks, including long-term multivariate, short-term multivariate, and long-term univariate scenarios. Experimental results demonstrate that LLM-Mixer achieves competitive performance compared to recent state-of-the-art models across various forecasting horizons.
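The multiscale decomposition the abstract describes can be illustrated with a minimal sketch: downsampling a series into several temporal resolutions via average pooling. This is a hypothetical illustration only, assuming average pooling at scale factors of 1, 2, and 4; the paper's actual preprocessing and the function name `multiscale_downsample` are not taken from the source.

```python
import numpy as np

def multiscale_downsample(series, scales=(1, 2, 4)):
    """Produce one view of the series per scale by average pooling.

    Illustrative assumption: each scale factor s pools s consecutive
    points into their mean, yielding a coarser temporal resolution.
    """
    views = []
    for s in scales:
        n = len(series) // s
        trimmed = np.asarray(series[: n * s], dtype=float)
        # Reshape into (n, s) blocks and average each block.
        views.append(trimmed.reshape(n, s).mean(axis=1))
    return views

x = np.arange(8, dtype=float)
views = multiscale_downsample(x)
# lengths of the three views: 8, 4, 2
```

Each coarser view smooths out short-term variation while preserving the long-term trend, which is the intuition behind feeding multiple resolutions to the frozen LLM.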

2023

Contrastive Learning for Universal Zero-Shot NLI with Cross-Lingual Sentence Embeddings
Md Kowsher | Md. Shohanur Islam Sobuj | Nusrat Jahan Prottasha | Mohammad Shamsul Arefin | Yasuhiko Morimoto
Proceedings of the 3rd Workshop on Multi-lingual Representation Learning (MRL)