Correction of Regression Predictions Using the Secondary Learner on the Sensitivity Analysis Outputs
Keywords: regression, predictions, correction of predictions, sensitivity analysis, prediction error, prediction accuracy
Abstract: For a given regression model, each individual prediction may be more or less accurate. The average accuracy of the system cannot provide an error estimate for a single particular prediction, which could be used to correct that prediction to a more accurate value. We propose a method for the correction of regression predictions that is based on the sensitivity analysis approach. Using the predictions gained in the sensitivity analysis procedure, we build a secondary regression predictor whose task is to predict the signed error of the prediction made by the original regression model. We test the proposed methodology using four regression models: locally weighted regression, linear regression, regression trees and neural networks. The results of our experiments indicate a significant increase in prediction accuracy in more than 20% of the experiments. The favorable results prevail especially with regression trees and neural networks, where locally weighted regression was used as the model for predicting the prediction error. In these experiments the prediction accuracy increased in 60% of the experiments with regression trees and in 50% of the experiments with neural networks, while the prediction error did not increase in any experiment.
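The corrective mechanism described in the abstract can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: locally weighted regression stands in for both the primary and the secondary model, and the sensitivity procedure is an assumption (we re-insert the query example into the training set with a label perturbed by a fraction of the target range and record how the prediction moves). All names and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(260, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 260)
X_tr, y_tr, X_te, y_te = X[:200], y[:200], X[200:], y[200:]

def lwr_predict(X_train, y_train, x_q, tau=0.5):
    """Locally weighted (kernel) regression: Gaussian-weighted mean of targets."""
    w = np.exp(-np.sum((X_train - x_q) ** 2, axis=1) / (2 * tau ** 2))
    return np.sum(w * y_train) / (np.sum(w) + 1e-12)

def sensitivity_features(X_train, y_train, x_q, epsilons=(0.1, 0.5)):
    """Assumed sensitivity procedure: re-insert the query example with a
    perturbed label and record how the prediction shifts for each epsilon."""
    base = lwr_predict(X_train, y_train, x_q)
    span = y_train.max() - y_train.min()
    feats = [base]
    for eps in epsilons:
        for sign in (1.0, -1.0):
            X_aug = np.vstack([X_train, x_q])
            y_aug = np.append(y_train, base + sign * eps * span)
            feats.append(lwr_predict(X_aug, y_aug, x_q) - base)
    return np.array(feats)

# Secondary learner: another LWR model, trained on the sensitivity features
# to predict the signed error of the primary prediction.
S_tr = np.array([sensitivity_features(X_tr, y_tr, x) for x in X_tr])
err_tr = y_tr - S_tr[:, 0]  # signed error of the primary prediction
S_te = np.array([sensitivity_features(X_tr, y_tr, x) for x in X_te])

# Corrected prediction = primary prediction + predicted signed error.
corrected = np.array([
    S_te[i, 0] + lwr_predict(S_tr, err_tr, S_te[i]) for i in range(len(X_te))
])
```

In a more careful experiment the signed training errors would be computed in leave-one-out fashion, since here each training prediction sees its own label; the sketch keeps the simpler in-sample variant for brevity.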
How to Cite
Bosnić, Z., & Kononenko, I. (2012). Correction of Regression Predictions Using the Secondary Learner on the Sensitivity Analysis Outputs. COMPUTING AND INFORMATICS, 29(6), 929–946. Retrieved from https://www.cai.sk/ojs/index.php/cai/article/view/119