LEADER |
00000cam a2200000 a 4500 |
001 |
on1355217709 |
003 |
OCoLC |
005 |
20241006213017.0 |
006 |
m     o  d |
007 |
cr un|---aucuu |
008 |
221217s2022    xx      o     000 0 eng d |
040 |
|
|
|a EBLCP
|b eng
|c EBLCP
|d GW5XE
|d YDX
|d FIE
|d UKAHL
|d OCLCF
|d SFB
|d OCLCO
|
019 |
|
|
|a 1354992744
|a 1367358741
|
020 |
|
|
|a 9783031066498
|q (electronic bk.)
|
020 |
|
|
|a 3031066499
|q (electronic bk.)
|
020 |
|
|
|z 9783031066481
|
020 |
|
|
|z 3031066480
|
024 |
7 |
|
|a 10.1007/978-3-031-06649-8
|2 doi
|
035 |
|
|
|a (OCoLC)1355217709
|z (OCoLC)1354992744
|z (OCoLC)1367358741
|
050 |
|
4 |
|a QA279.2
|
072 |
|
7 |
|a UYQM
|2 bicssc
|
072 |
|
7 |
|a COM004000
|2 bisacsh
|
072 |
|
7 |
|a UYQM
|2 thema
|
049 |
|
|
|a HCDD
|
100 |
1 |
|
|a Vovk, Vladimir,
|d 1960-
|
245 |
1 |
0 |
|a Algorithmic learning in a random world /
|c Vladimir Vovk, Alexander Gammerman, Glenn Shafer.
|
250 |
|
|
|a 2nd ed.
|
260 |
|
|
|a Cham :
|b Springer,
|c 2022.
|
300 |
|
|
|a 1 online resource (490 p.)
|
336 |
|
|
|a text
|b txt
|2 rdacontent
|
337 |
|
|
|a computer
|b c
|2 rdamedia
|
338 |
|
|
|a online resource
|b cr
|2 rdacarrier
|
505 |
0 |
|
|a Intro -- Contents -- Preface to the Second Edition -- Preface to the First Edition -- Notation and Abbreviations -- Sets, Bags, and Sequences -- Stochastics -- Machine Learning -- Programming -- Confidence Prediction -- Other Notations -- Abbreviations -- 1 Introduction -- 1.1 Machine Learning -- 1.1.1 Learning Under Randomness -- 1.1.2 Learning Under Unconstrained Randomness -- 1.2 A Shortcoming of Statistical Learning Theory -- 1.2.1 The Hold-Out Estimate of Confidence -- 1.2.2 The Contribution of This Book -- 1.3 The Online Framework -- 1.3.1 Online Learning
|
505 |
8 |
|
|a 1.3.2 Online/Offline Compromises -- 1.3.3 One-Off and Offline Learning -- 1.3.4 Induction, Transduction, and the Online Framework -- 1.4 Conformal Prediction -- 1.4.1 Nested Prediction Sets -- 1.4.2 Validity -- 1.4.3 Efficiency -- 1.4.4 Conditionality -- 1.4.5 Flexibility of Conformal Predictors -- 1.5 Probabilistic Prediction Under Unconstrained Randomness -- 1.5.1 Universally Consistent Probabilistic Predictor -- 1.5.2 Probabilistic Prediction Using a Finite Dataset -- 1.5.3 Venn Prediction -- 1.5.4 Conformal Predictive Distributions -- 1.6 Beyond Randomness -- 1.6.1 Testing Randomness
|
505 |
8 |
|
|a 1.6.2 Online Compression Models -- 1.7 Context -- Part I Set Prediction -- 2 Conformal Prediction: General Case and Regression -- 2.1 Confidence Predictors -- 2.1.1 Assumptions -- 2.1.2 Simple Predictors and Confidence Predictors -- 2.1.3 Validity -- 2.1.4 Randomized Confidence Predictors -- 2.1.5 Confidence Predictors Over a Finite Horizon -- 2.1.6 One-Off and Offline Confidence Predictors -- 2.2 Conformal Predictors -- 2.2.1 Bags -- 2.2.2 Nonconformity and Conformity -- 2.2.3 p-Values -- 2.2.4 Definition of Conformal Predictors -- 2.2.5 Validity -- 2.2.6 Smoothed Conformal Predictors
|
505 |
8 |
|
|a 2.2.7 Finite-Horizon Conformal Prediction -- 2.2.8 One-Off and Offline Conformal Predictors -- 2.2.9 General Schemes for Defining Nonconformity -- Conformity to a Bag -- Conformity to a Property -- 2.2.10 Deleted Conformity Measures -- 2.3 Conformalized Ridge Regression -- 2.3.1 Least Squares and Ridge Regression -- 2.3.2 Basic CRR -- 2.3.3 Two Modifications -- 2.3.4 Dual Form Ridge Regression -- 2.4 Conformalized Nearest Neighbours Regression -- 2.5 Efficiency of Conformalized Ridge Regression -- 2.5.1 Hard and Soft Models -- 2.5.2 Bayesian Ridge Regression -- 2.5.3 Efficiency of CRR
|
505 |
8 |
|
|a 2.6 Are There Other Ways to Achieve Validity? -- 2.7 Conformal Transducers -- 2.7.1 Definitions and Properties of Validity -- 2.7.2 Normalized Confidence Predictors and Confidence Transducers -- 2.8 Proofs -- 2.8.1 Proof of Theorem 2.2 -- 2.8.2 Proof of Theorem 2.7 -- Regularizing the Rays in Upper CRR -- Proof Proper -- 2.8.3 Proof of Theorem 2.10 -- 2.9 Context -- 2.9.1 Exchangeability vs Randomness -- 2.9.2 Conformal Prediction -- 2.9.3 Two Equivalent Definitions of Nonconformity Measures -- 2.9.4 The Two Meanings of Conformity in Conformal Prediction
|
505 |
8 |
|
|a 2.9.5 Examples of Nonconformity Measures
|
520 |
|
|
|a This book is about conformal prediction, an approach to prediction that originated in machine learning in the late 1990s. The main feature of conformal prediction is the principled treatment of the reliability of predictions. The prediction algorithms described, conformal predictors, are provably valid in the sense that they evaluate the reliability of their own predictions in a way that is neither over-pessimistic nor over-optimistic (the latter being especially dangerous). The approach is still flexible enough to incorporate most of the existing powerful methods of machine learning. The book covers both key conformal predictors and the mathematical analysis of their properties. Algorithmic Learning in a Random World contains, in addition to proofs of validity, results about the efficiency of conformal predictors. The only assumption required for validity is that of "randomness" (the prediction algorithm is presented with independent and identically distributed examples); in later chapters, even the assumption of randomness is significantly relaxed. Interesting results about efficiency are established both under randomness and under stronger assumptions. Since publication of the First Edition in 2005, conformal prediction has found numerous applications in medicine and industry, and it is becoming a popular machine-learning technique. This Second Edition contains three new chapters. One is about conformal predictive distributions, which are more informative than the set predictions produced by standard conformal predictors. Another is about the efficiency of ways of testing the assumption of randomness based on conformal prediction. The third new chapter harnesses conformal testing procedures for protecting machine-learning algorithms against changes in the distribution of the data. In addition, the existing chapters have been revised, updated, and expanded.
|
588 |
0 |
|
|a Online resource; title from PDF title page (SpringerLink, viewed January 6, 2023).
|
650 |
|
0 |
|a Prediction theory.
|
650 |
|
0 |
|a Algorithms.
|
650 |
|
0 |
|a Stochastic processes.
|
650 |
|
7 |
|a algorithms.
|2 aat
|
650 |
|
7 |
|a Algorithms
|2 fast
|
650 |
|
7 |
|a Prediction theory
|2 fast
|
650 |
|
7 |
|a Stochastic processes
|2 fast
|
650 |
|
7 |
|a Teoria de la predicció
|2 thub
|
650 |
|
7 |
|a Algorismes.
|2 thub
|
650 |
|
7 |
|a Processos estocàstics.
|2 thub
|
655 |
|
7 |
|a Llibres electrònics.
|2 thub
|
700 |
1 |
|
|a Gammerman, A.
|q (Alexander)
|
700 |
1 |
|
|a Shafer, Glenn,
|d 1946-
|
776 |
0 |
8 |
|i Print version:
|a Vovk, Vladimir
|t Algorithmic Learning in a Random World
|d Cham : Springer International Publishing AG, c2022
|z 9783031066481
|
856 |
4 |
0 |
|u https://holycross.idm.oclc.org/login?auth=cas&url=https://link.springer.com/10.1007/978-3-031-06649-8
|y Click for online access
|
903 |
|
|
|a SPRING-MATH2022
|
994 |
|
|
|a 92
|b HCD
|