Prediction of Oil Prices Using Bagging and Random Subspace
Authors
Abstract
The problem of predicting oil prices is worthy of attention. As oil represents the backbone of the world economy, the goal of this paper is to design a more accurate prediction model. We model the prediction process as three steps: feature selection, data partitioning, and analysis of the prediction models. Six prediction models are considered, namely Multi-Layered Perceptron (MLP), Sequential Minimal Optimization for regression (SMOreg), Isotonic Regression, Multilayer Perceptron Regressor (MLP Regressor), Extra-Tree, and Reduced Error Pruning Tree (REPtree). These models were selected and tested after experimenting with several other widely used prediction models. The six algorithms are compared with previous work on the basis of root mean squared error (RMSE) to identify the most suitable algorithm. Further, two meta schemes, Bagging and Random Subspace, are adopted and compared with the previous algorithms using mean squared error (MSE) to evaluate performance. Experimental evidence illustrates that the Random Subspace scheme outperforms most of the existing techniques.
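To illustrate how the two meta schemes differ, the minimal sketch below mimics the three-step process (data partitioning, a base regressor, Bagging versus Random Subspace) in Python with scikit-learn. This is not the authors' pipeline: the paper uses oil-price data and WEKA implementations (SMOreg, REPtree, and so on), whereas the synthetic dataset, the DecisionTreeRegressor base learner, the 80/20 split, the 25 sub-models, and the 50% feature subspace used here are illustrative assumptions rather than reported settings.

# Minimal sketch, under the assumptions stated above, of Bagging and
# Random Subspace meta schemes evaluated with MSE and RMSE.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import BaggingRegressor
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the oil-price feature matrix (the study used real price data).
X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=42)

# Data partitioning step (an assumed 80/20 train/test split).
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Rough stand-in for a WEKA tree learner such as REPtree.
base = DecisionTreeRegressor(random_state=42)

# Bagging: each sub-model is trained on a bootstrap resample of the
# training instances; all features are kept.
bagging = BaggingRegressor(base, n_estimators=25, bootstrap=True, random_state=42)

# Random Subspace: each sub-model sees all training instances but only a
# random subset of the features (here 50%, an assumed subspace size).
subspace = BaggingRegressor(base, n_estimators=25, bootstrap=False,
                            max_features=0.5, random_state=42)

for name, model in [("Bagging", bagging), ("Random Subspace", subspace)]:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    mse = mean_squared_error(y_test, pred)   # MSE  = mean((y - y_hat)^2)
    rmse = np.sqrt(mse)                      # RMSE = sqrt(MSE)
    print(f"{name}: MSE={mse:.3f}  RMSE={rmse:.3f}")

The only difference between the two wrappers is how diversity is injected: Bagging bootstraps the training instances, while Random Subspace trains each sub-model on a random subset of the features. Since RMSE is simply the square root of MSE, both metrics rank any single model identically.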
Similar Articles
Entropy-Based Bagging for Fault Prediction of Transformers Using Oil-Dissolved Gas Data
The development of the smart grid has resulted in new requirements for fault prediction of power transformers. This paper presents an entropy-based Bagging (E-Bagging) method for prediction of characteristic parameters related to power transformer faults. A parameter of comprehensive information entropy of the sample data is brought forward to improve the resampling process of the E-Bagging method...
Combining Bagging, Boosting and Random Subspace Ensembles for Regression Problems
Bagging, boosting and random subspace methods are well known re-sampling ensemble methods that generate and combine a diversity of learners using the same learning algorithm for the base-regressor. In this work, we build an ensemble of bagging, boosting and random subspace ensembles with 8 sub-regressors in each, and an averaging methodology is then used for the final prediction. We ...
Application of ensemble learning techniques to model the atmospheric concentration of SO2
In view of pollution prediction modeling, the study adopts homogenous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of Sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...
Investigation of Property Valuation Models Based on Decision Tree Ensembles Built over Noised Data
Ensemble machine learning methods incorporating bagging, random subspace, random forest, and rotation forest, employing decision trees, i.e. Pruned Model Trees, as base learning algorithms, were developed in the WEKA environment. The methods were applied to the real-world regression problem of predicting the prices of residential premises based on historical data of sales/purchase transactions. T...
Improving experimental studies about ensembles of classifiers for bankruptcy prediction and credit scoring
Previous studies about ensembles of classifiers for bankruptcy prediction and credit scoring have been presented. In these studies, different ensemble schemes for complex classifiers were applied, and the best results were obtained using the Random Subspace method. The Bagging scheme was one of the ensemble methods used in the comparison. However, it was not correctly used. It is very important...