Toward a Justification of Meta-learning: Is the No Free Lunch Theorem a Show-stopper?
Abstract
We present a preliminary analysis of the fundamental viability of meta-learning, revisiting the No Free Lunch (NFL) theorem. The analysis shows that given some simple and very basic assumptions, the NFL theorem is of little relevance to research in Machine Learning. We augment the basic NFL framework to illustrate that the notion of an Ultimate Learning Algorithm is well defined. We show that, although cross-validation still is not a viable way to construct general-purpose learning algorithms, meta-learning offers a natural alternative. We still have to pay for our lunch, but the cost is reasonable: the necessary fundamental assumptions are ones we all make anyway.
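For reference, the supervised-learning form of the NFL result being revisited (Wolpert, 1996) can be stated roughly as follows; the notation here is the standard one rather than the paper's own. For any two learning algorithms $A$ and $B$ and any fixed training set $d$,

$$\sum_{f} E[\,C \mid f, d, A\,] \;=\; \sum_{f} E[\,C \mid f, d, B\,],$$

where the sum ranges over all possible target functions $f$ and $C$ is the off-training-set error. Uniformly averaged over every conceivable target, no learning algorithm outperforms any other; the theorem only bites if one refuses to assume anything at all about which targets are plausible.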
Similar Papers
An Empirical Overview of the No Free Lunch Theorem and Its Effect on Real-World Machine Learning Classification
A sizable amount of research has been done to improve the mechanisms for knowledge extraction such as machine learning classification or regression. Quite unintuitively, the no free lunch (NFL) theorem states that all optimization problem strategies perform equally well when averaged over all possible problems. This fact seems to clash with the effort put forth toward better algorithms. This le...
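The "averaged over all possible problems" claim is easy to check exhaustively on a toy instance. Below is a small Python sketch (not from the paper; the four-point domain, the evaluation budgets, and the two strategies are arbitrary illustrative choices) comparing a blind sweep with a value-adaptive search over every binary objective function on the domain: their average best-found values coincide exactly, as the NFL theorem predicts for non-resampling black-box search.

```python
from itertools import product

# Illustrative NFL check on a tiny, fully enumerable problem class:
# a domain of 4 candidate points with binary objective values, so that
# "averaged over all possible problems" means all 2**4 = 16 functions.
DOMAIN = [0, 1, 2, 3]
ALL_FUNCTIONS = [dict(zip(DOMAIN, outs))
                 for outs in product([0, 1], repeat=len(DOMAIN))]

def sweep(f, budget):
    """Blind strategy: evaluate points in fixed ascending order."""
    return max(f[x] for x in DOMAIN[:budget])

def adaptive(f, budget):
    """'Clever' strategy: keep scanning upward after a good value,
    jump to the furthest unvisited point after a bad one."""
    unvisited, seen = list(DOMAIN), []
    x = unvisited[0]
    for _ in range(budget):
        unvisited.remove(x)
        seen.append(f[x])
        if not unvisited:
            break
        x = unvisited[0] if seen[-1] == 1 else unvisited[-1]
    return max(seen)

for budget in (1, 2, 3):
    for name, strategy in (("sweep", sweep), ("adaptive", adaptive)):
        avg = sum(strategy(f, budget) for f in ALL_FUNCTIONS) / len(ALL_FUNCTIONS)
        print(f"budget={budget}  {name:8s} average best value = {avg:.3f}")
# Both strategies print the same average (1 - 2**-budget) for every budget:
# no non-resampling search beats another when averaged over *all* functions.
```

On any single function the adaptive strategy may win or lose against the sweep, but the exhaustive average over the whole function class washes those differences out.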
The No Free Lunch Theorem Disproved by Counterexample: A Justification for Regrouping
After deriving the particle swarm equations from basic physics, this paper shows by contradiction that NFL Theorem 1, and consequently Theorems 2 and 3, are irrelevant to continuous optimization. As the discrete nature of matter at its most fundamental level is generally irrelevant from the broader classical perspective, so too is the discrete nature of an optimization problem at its most fundam...
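For context, the particle swarm update equations the abstract refers to are usually written in the following standard form (the paper's physics-based derivation may use different notation):

$$v_i \leftarrow \omega\, v_i + c_1 r_1 (p_i - x_i) + c_2 r_2 (g - x_i), \qquad x_i \leftarrow x_i + v_i,$$

where $x_i$ and $v_i$ are the real-valued position and velocity of particle $i$, $p_i$ its personal best, $g$ the swarm's best, $\omega$ the inertia weight, $c_1, c_2$ the cognitive and social coefficients, and $r_1, r_2$ fresh uniform draws from $[0, 1]$. The positions evolve in a continuous space, which is the setting the abstract contrasts with the discrete framing of the NFL theorems.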
Learning Using Anti-Training with Sacrificial Data
Traditionally the machine-learning community has viewed the No Free Lunch (NFL) theorems for search and optimization as a limitation. We review, analyze, and unify the NFL theorem with the perspectives of "blind" search and meta-learning to arrive at necessary conditions for improving black-box optimization. We survey meta-learning literature to determine when and how meta-learning can benefi...
No-Free-Lunch and Bayesian Optimality
We take a Bayesian approach to the issues of bias, meta-bias, transfer, overfitting, and No-Free-Lunch in the context of supervised learning. If we accept certain relationships between the function class, on-training-set data, and off-training-set data, then a graphical model can be created that represents the supervised learning problem. This graphical model dictates a specific algorithm which wil...
No Free Lunch Theorem, Inductive Skepticism, and the Optimality of Meta-Induction
The no free lunch theorem (Wolpert 1996) is a radicalized version of Hume's induction skepticism. It asserts that, relative to a uniform probability distribution over all possible worlds, all computable prediction algorithms, whether 'clever' inductive or 'stupid' guessing methods (etc.), have the same expected predictive success. This theorem seems to be in conflict with results about meta-in...
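To spell out that claim with a minimal worked example (binary worlds are an illustrative assumption here, not the paper's formalism): let worlds be sequences $w \in \{0,1\}^n$ drawn from the uniform distribution, and let $m$ be any prediction method whose guess for the next bit depends only on the bits observed so far. Under the uniform prior the next bit is independent of the past and takes each value with probability $1/2$, so

$$P\big(w_{t+1} = m(w_{1:t})\big) \;=\; \sum_{w_{1:t}} P(w_{1:t})\, P\big(w_{t+1} = m(w_{1:t}) \mid w_{1:t}\big) \;=\; \tfrac{1}{2}.$$

Every method, inductive or guessing, therefore has the same expected success rate of $1/2$, which is the sense in which the uniform prior levels all predictors.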