Incorporating experts’ judgment into machine learning models

Authors

Abstract

Machine learning (ML) models have been quite successful in predicting outcomes in many applications. However, in some cases, domain experts might have a judgment about the expected outcome that conflicts with the prediction of ML models. One main reason for this is that the training data might not be totally representative of the population. In this paper, we present a novel framework that aims at leveraging experts' judgment to mitigate this conflict. The underlying idea behind our framework is to first determine, using a generative adversarial network, the degree of representation of an unlabeled data point in the training data. Then, based on such degree of representation, we correct the machine learning model's prediction by incorporating expert judgment into it, where the higher the aforementioned degree of representation, the less weight we put on the expert intuition when computing the corrected output, and vice-versa. We perform multiple numerical experiments on synthetic data as well as two real-world case studies (one from the IT services industry and the other from the financial industry). All results show the effectiveness of our framework; it yields much closeness to the experts' judgment with minimal sacrifice of accuracy, when compared to baseline methods. We also develop a new evaluation metric that combines accuracy and closeness to experts' judgment. Our framework is statistically significantly better on this evaluated metric.
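The correction rule described above can be sketched as a convex combination of the ML prediction and the expert's judgment, weighted by the degree of representation. This is an illustrative sketch only: the abstract does not specify the paper's exact combination formula, and the function name `correct_prediction` and the linear blending rule are assumptions for illustration.

```python
def correct_prediction(ml_pred, expert_judgment, representation_degree):
    """Blend an ML prediction with an expert's judgment.

    representation_degree in [0, 1] measures how well the input is
    represented in the training data (e.g., a score derived from a
    GAN discriminator, as the framework suggests). A higher degree
    puts more weight on the ML model; a lower degree shifts the
    corrected output toward the expert's judgment.
    """
    alpha = representation_degree
    return alpha * ml_pred + (1.0 - alpha) * expert_judgment

# Well-represented point: corrected output stays close to the ML prediction.
print(correct_prediction(0.8, 0.2, 0.9))  # ~0.74
# Poorly represented point: corrected output shifts toward the expert.
print(correct_prediction(0.8, 0.2, 0.1))  # ~0.26
```

The linear blend is the simplest choice satisfying the stated monotonicity (more representation, less expert weight); other monotone schemes would fit the description equally well.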


Similar articles

Incorporating Prior Domain Knowledge Into Inductive Supervised Machine Learning

The paper reviews the recent developments of incorporating prior domain knowledge into inductive machine learning, and proposes a guideline that incorporates prior domain knowledge in three key issues of inductive machine learning algorithms: consistency, generalization and convergence. With respect to each issue, this paper gives some approaches to improve the performance of the inductive mach...


Incorporating Expert Judgement into Bayesian Network Machine Learning

We review the challenges of Bayesian network learning, especially parameter learning, and specify the problem of learning with sparse data. We explain how it is possible to incorporate both qualitative knowledge and data with a multinomial parameter learning method to achieve more accurate predictions with sparse data. 1 Review of Bayesian Network Learning Constructing a Bayesian network (BN) fr...


Incorporating Common Sense into a Machine Learning System (Invited Paper)

The knowledge that must be acquired by machine learning systems which try to mimic common sense, as exhibited by humans, is inherently incomplete, redundant or even contradictory. Thus, the main characteristic of common sense is nonmonotonicity, which is introduced by exceptions to general rules; redundancy, which is introduced by continuous belief revisions; and ambiguity, which is introduced ...


Incorporating Domain Models into Bayesian Optimization for Reinforcement Learning

In many Reinforcement Learning (RL) domains there is a high cost for generating experience in order to evaluate an agent's performance. An appealing approach to reducing the number of expensive evaluations is Bayesian Optimization (BO), which is a framework for global optimization of noisy and costly-to-evaluate functions. Prior work in a number of RL domains has demonstrated the effectiveness ...


Incorporating Test Inputs into Learning

In many applications, such as credit default prediction and medical image recognition, test inputs are available in addition to the labeled training examples. We propose a method to incorporate the test inputs into learning. Our method results in solutions having smaller test errors than that of the simple training solution, especially for noisy problems or small training sets.



Journal

Journal title: Expert Systems With Applications

Year: 2023

ISSN: 1873-6793, 0957-4174

DOI: https://doi.org/10.1016/j.eswa.2023.120118