Monotone conditional complexity bounds on future prediction errors

Alexey Chernov, Marcus Hutter

Research output: Chapter in book/conference proceeding with ISSN or ISBN › Conference contribution › Research › peer-reviewed

Abstract

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at time t > 1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ... by a new variant of the algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
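For context, the Solomonoff bound the abstract refers to is usually stated as follows (a sketch in standard notation for the binary-alphabet case, where μ is the true computable measure and K(μ) its prefix complexity; the exact constant depends on the formulation):

```latex
% Solomonoff's bound (sketch): the summed expected squared prediction
% error of the universal predictor M under the true measure \mu is
% finite, bounded in terms of the algorithmic complexity of \mu.
\sum_{t=1}^{\infty} \mathbf{E}_{\mu}\!\left[
    \bigl( M(0 \mid x_{<t}) - \mu(0 \mid x_{<t}) \bigr)^{2}
\right]
\;\le\; \tfrac{\ln 2}{2}\, K(\mu)
```

The paper's contribution is an analogous bound on the *remaining* loss after a prefix x has already been observed, with K(μ) replaced by a new conditional complexity of μ given x that is monotone in x.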
Original language: English
Title of host publication: Proceedings of the 16th International Conference Algorithmic Learning Theory 2005
Place of publication: Berlin Heidelberg
Pages: 414-428
Number of pages: 15
Volume: 3734
DOI: 10.1007/11564089_32
Publication status: Published - 31 Dec 2005
Event: 16th International Conference Algorithmic Learning Theory 2005 - Singapore, October 8-11, 2005
Duration: 31 Dec 2005 → …

Publication series

Name: Lecture Notes in Computer Science

Conference

Conference: 16th International Conference Algorithmic Learning Theory 2005
Period: 31/12/05 → …


Cite this

Chernov, A., & Hutter, M. (2005). Monotone conditional complexity bounds on future prediction errors. In Proceedings of the 16th International Conference Algorithmic Learning Theory 2005 (Vol. 3734, pp. 414-428). (Lecture Notes in Computer Science). Berlin Heidelberg. https://doi.org/10.1007/11564089_32
@inproceedings{250a8ef1da1144f08c6ac46ae730bcfa,
title = "Monotone conditional complexity bounds on future prediction errors",
author = "Alexey Chernov and Marcus Hutter",
year = "2005",
month = "12",
day = "31",
doi = "10.1007/11564089_32",
language = "English",
volume = "3734",
series = "Lecture Notes in Computer Science",
pages = "414--428",
booktitle = "Proceedings of the 16th International Conference Algorithmic Learning Theory 2005",
}

