Algorithmic complexity bounds on future prediction errors

Alexey Chernov, Marcus Hutter, Juergen Schmidhuber

Research output: Contribution to journal › Article (peer-reviewed)

Abstract

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t > 1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ... by a new variant of the algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition, in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
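
For orientation, the two bounds the abstract contrasts can be written schematically in LaTeX. The first line is Solomonoff's classical total bound, a standard result; the second is only our schematic reading of the new result, stated up to additive constants and with the per-step loss measure left abstract. Here Km(μ | x) stands for the paper's new monotone conditional complexity and d(x | μ) for the randomness deficiency of x; both symbols are our shorthand for quantities the abstract names.

% Solomonoff: total expected deviation of M from the true mu, summed over all times
\sum_{t=1}^{\infty} \mathbb{E}_{\mu}\Big[ \sum_{a} \big( M(a \mid x_{<t}) - \mu(a \mid x_{<t}) \big)^{2} \Big] \;\le\; K(\mu)\,\ln 2

% Schematic posterior bound (our reading of the abstract, constants omitted):
% after observing x = x_1 ... x_t, the remaining deviation is bounded by
\sum_{s > t} \mathbb{E}_{\mu}\big[\, \text{deviation at step } s \mid x \,\big] \;\lesssim\; \mathrm{Km}(\mu \mid x) + K\big( d(x \mid \mu) \big)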
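The role of the complexity term is easiest to see in a finite Bayesian model class, where K(μ) is replaced by -log2 of the true model's prior weight. The following Python toy is ours and purely illustrative (the model class, horizon, and seed are assumptions, not the paper's construction): a Bayes mixture over five Bernoulli coins predicts a binary sequence online, and its cumulative squared prediction error is compared with the corresponding Solomonoff-style bound.

# Toy illustration (not the paper's construction): online prediction with a
# Bayesian mixture over a small hypothesis class, mirroring the role of
# Solomonoff's M, with K(mu) replaced by -log2(prior weight of the true model).
import math
import random

# Hypothesis class: Bernoulli(theta) coins with a uniform prior.
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
prior = {th: 1.0 / len(thetas) for th in thetas}

true_theta = 0.7          # the "true environment" mu
random.seed(0)

posterior = dict(prior)
total_sq_err = 0.0
for t in range(1000):
    # Mixture prediction: posterior-weighted probability of observing a 1.
    m_pred = sum(w * th for th, w in posterior.items())
    total_sq_err += (m_pred - true_theta) ** 2

    # Observe the next bit and do the Bayesian posterior update.
    bit = 1 if random.random() < true_theta else 0
    for th in posterior:
        posterior[th] *= th if bit == 1 else (1.0 - th)
    z = sum(posterior.values())
    for th in posterior:
        posterior[th] /= z

# Analogue of Solomonoff's bound: total squared error <= (ln 2 / 2) * -log2(w_mu),
# where w_mu is the prior weight of the true environment. The bound holds in
# mu-expectation; a single run typically stays well below it.
bound = (math.log(2) / 2) * -math.log2(prior[true_theta])
print(f"cumulative squared error: {total_sq_err:.4f}  bound: {bound:.4f}")

On a typical run the cumulative error settles well below the bound, because the posterior concentrates on the true coin after a few hundred observations.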
Original language: English
Pages (from-to): 242-261
Number of pages: 20
Journal: Information and Computation
Volume: 205
Issue number: 2
DOIs: 10.1016/j.ic.2006.10.004
Publication status: Published - 28 Feb 2007

Bibliographical note

© 2007. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/

Keywords

  • Kolmogorov complexity
  • posterior bounds
  • online sequential prediction
  • Solomonoff prior
  • monotone conditional complexity
  • total error
  • future loss
  • randomness deficiency

Cite this

Chernov, Alexey; Hutter, Marcus; Schmidhuber, Juergen. Algorithmic complexity bounds on future prediction errors. Information and Computation. 2007; Vol. 205, No. 2, pp. 242-261.
@article{40243406ddc147b882ec2903c47a8b35,
title = "Algorithmic complexity bounds on future prediction errors",
abstract = "We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t > 1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ... by a new variant of the algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition, in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.",
keywords = "Kolmogorov complexity, posterior bounds, online sequential prediction, Solomonoff prior, monotone conditional complexity, total error, future loss, randomness deficiency",
author = "Alexey Chernov and Marcus Hutter and Juergen Schmidhuber",
note = "{\textcopyright} 2007. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/",
year = "2007",
month = feb,
day = "28",
doi = "10.1016/j.ic.2006.10.004",
language = "English",
journal = "Information and Computation",
volume = "205",
number = "2",
pages = "242--261",
}
