Zhiwei Fang
2025
Rethinking Cross-Subject Data Splitting for Brain-to-Text Decoding
Congchi Yin | Qian Yu | Zhiwei Fang | Changping Peng | Piji Li
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Recent major milestones have successfully reconstructed natural language from non-invasive brain signals (e.g., functional Magnetic Resonance Imaging (fMRI) and Electroencephalogram (EEG)) across subjects. However, we find that current dataset splitting strategies for cross-subject brain-to-text decoding are flawed. Specifically, we first demonstrate that all current splitting methods suffer from the data leakage problem, i.e., the leakage of validation and test data into the training set, which results in significant overfitting and overestimation of decoding models. In this study, we develop a correct cross-subject data splitting criterion without data leakage for decoding fMRI and EEG signals to text. Several SOTA brain-to-text decoding models are re-evaluated under the proposed criterion to support further research.
HierDiffuse: Progressive Diffusion for Robust Interest Fusion in CTR Prediction
Ziheng Ni | Congcong Liu | Yuying Chen | Zhiwei Fang | Changping Peng | Zhangang Lin | Ching Law | Jingping Shao
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Modern recommendation systems grapple with reconciling users’ enduring preferences with transient interests, particularly in click-through rate (CTR) prediction. Existing approaches inadequately fuse long-term behavioral profiles (e.g., aggregated purchase trends) and short-term interaction sequences (e.g., real-time clicks), suffering from representational misalignment and noise in transient signals. We propose HierDiffuse, a unified framework that redefines interest fusion as a hierarchical denoising process through diffusion models. Our approach addresses these challenges via three innovations: (1) a cross-scale diffusion mechanism aligns long- and short-term representations by iteratively refining long-term interests using short-term contextual guidance; (2) a Semantic Guidance Disentanglement (SGD) mechanism explicitly decouples core interests from noise in short-term signals; (3) a Trajectory Convergence Constraint (TCC) accelerates diffusion model inference without reducing generation quality, meeting the high-QPS (Queries Per Second) and low-latency constraints of online recommendation and advertising systems. HierDiffuse eliminates ad-hoc fusion operators, dynamically integrates multi-scale interests, enhances robustness to spurious interactions, and improves inference speed. Extensive experiments on real-world datasets demonstrate state-of-the-art performance, with significant improvements in CTR prediction accuracy and robustness to noisy interactions. Our work establishes diffusion models as a principled paradigm for adaptive interest fusion in recommendation systems.