2 papers across 2 sessions
We introduce Recursive Inference Scaling (RIS), a plug-in technique that exploits language's fractal structure to boost model performance, unlocking a new dimension of inference scaling.
We propose correlation dimension as a practical, model-agnostic metric that captures structural complexity and detects degeneration in large language model outputs beyond what perplexity reveals.
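The correlation dimension mentioned above is commonly estimated with the Grassberger-Procaccia method: count the fraction of point pairs within radius r and read the dimension off the slope of log C(r) versus log r. The sketch below is a minimal, generic illustration of that estimator on a synthetic point cloud, not the paper's exact procedure; the function name and the choice of radii are illustrative assumptions.

```python
import numpy as np

def correlation_dimension(points, radii):
    """Grassberger-Procaccia estimate of the correlation dimension.

    points: (n, d) array of points; radii: increasing radii to probe.
    Returns the slope of log C(r) vs log r, where C(r) is the
    fraction of point pairs at distance < r (the correlation integral).
    """
    # Pairwise Euclidean distances between all distinct pairs.
    diffs = points[:, None, :] - points[None, :, :]
    d = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(len(points), k=1)
    dists = d[iu]
    # Correlation integral C(r): fraction of pairs closer than r.
    C = np.array([(dists < r).mean() for r in radii])
    # Correlation dimension: slope of the log-log scaling C(r) ~ r^D.
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# Demo: points filling a 2-D square embedded in 5-D ambient space.
# The estimated dimension should be close to 2, not 5 -- the metric
# reflects the intrinsic structure of the data, not the embedding.
rng = np.random.default_rng(0)
pts = np.zeros((600, 5))
pts[:, :2] = rng.random((600, 2))
radii = np.logspace(-1.2, -0.4, 8)
dim = correlation_dimension(pts, radii)
print(round(dim, 2))
```

In the LLM setting one would apply such an estimator to a trajectory of model-output embeddings; degenerate (e.g. repetitive) text collapses onto a lower-dimensional set, which the slope picks up even when perplexity looks normal.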