2 papers across 2 sessions
SplashNet adds Rolling Time Normalization, Aggressive Channel Masking, and a Split‑and‑Share bilateral encoder to sEMG typing, cutting the emg2qwerty baseline's zero‑shot and fine‑tuned character error rates (CERs) by 31% and 21% respectively, with half the parameters.
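SplashNet's exact formulation isn't reproduced here, but the idea behind rolling time normalization can be sketched as causally standardizing each sEMG channel by trailing-window statistics; the window size and epsilon below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rolling_time_normalize(x: np.ndarray, window: int = 200, eps: float = 1e-6) -> np.ndarray:
    """Causally normalize a (T, C) multichannel signal.

    Each sample is standardized by the mean and std of the trailing
    `window` samples of its own channel, so no future data leaks in.
    `window` and `eps` are hypothetical hyperparameters.
    """
    T, _ = x.shape
    out = np.empty_like(x, dtype=np.float64)
    for t in range(T):
        lo = max(0, t - window + 1)          # trailing (causal) window
        seg = x[lo:t + 1]
        mu = seg.mean(axis=0)                # per-channel rolling mean
        sd = seg.std(axis=0)                 # per-channel rolling std
        out[t] = (x[t] - mu) / (sd + eps)    # standardize this sample
    return out
```

A constant input normalizes to zero, and drifting baselines are removed per channel — one plausible reason such a step helps zero-shot transfer across sessions and users.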
A user study evaluates how well language models help humans internalize their reasoning, finding that strong model performance alone doesn't guarantee effective reasoning transfer.