Gupta et al.: Boosting for regression transfer via importance sampling

Gupta S, Bi J, Liu Y, Wildani A. 2023. Boosting for regression transfer via importance sampling. Int J Data Sci Anal.

Instance transfer learning methods are highly effective for continuous-valued (regression) datasets. However, they can suffer negative transfer due to distribution shift between the training and test data, as well as skewed training caused by a source dataset that is much larger than the target dataset. To mitigate this, we introduce S-TrAdaBoost.R2, a boosting-based instance transfer method that uses importance sampling to reduce the skew in training and a balanced weighting approach to handle the distribution shift. We evaluated our approach on 8 standard regression datasets of varying complexity and found that S-TrAdaBoost.R2 outperforms competing transfer learning methods 63% of the time. It also delivered consistent performance, in contrast to the sporadic results observed for the other transfer learning methods.
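The abstract names two ingredients: importance sampling to keep a large source set from dominating training, and weighting to correct for distribution shift. The sketch below is illustrative only, not the paper's algorithm: it estimates source-instance importance weights with the discriminative density-ratio trick (a logistic classifier separating source from target, fit here with plain gradient descent), then importance-samples a source subset of a size comparable to the target set before pooling the data for a downstream regressor. All names and the synthetic data are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data: a large source set and a smaller, shifted target set.
X_src = rng.normal(0.0, 1.0, size=(2000, 1))          # source distribution
X_tgt = rng.normal(1.0, 0.7, size=(100, 1))           # shifted target distribution
y_src = np.sin(X_src[:, 0]) + rng.normal(0, 0.1, 2000)
y_tgt = np.sin(X_tgt[:, 0]) + rng.normal(0, 0.1, 100)

def density_ratio_weights(X_s, X_t, steps=2000, lr=0.5):
    """Estimate w(x) ∝ p_target(x) / p_source(x) via a logistic classifier
    that separates source (label 0) from target (label 1) examples."""
    X = np.vstack([X_s, X_t])
    z = np.concatenate([np.zeros(len(X_s)), np.ones(len(X_t))])
    Xb = np.hstack([X, np.ones((len(X), 1))])          # add a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):                             # plain gradient descent
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - z) / len(X)
    Xs_b = np.hstack([X_s, np.ones((len(X_s), 1))])
    p_s = 1.0 / (1.0 + np.exp(-Xs_b @ w))
    # The odds p/(1-p) are proportional to p_t(x)/p_s(x); the class-prior
    # constant cancels when the weights are normalized below.
    return p_s / (1.0 - p_s)

w_src = density_ratio_weights(X_src, X_tgt)
probs = w_src / w_src.sum()

# Importance-sample a source subset of a size comparable to the target set,
# so the large source pool cannot skew the subsequent training.
idx = rng.choice(len(X_src), size=200, replace=False, p=probs)
X_train = np.vstack([X_src[idx], X_tgt])
y_train = np.concatenate([y_src[idx], y_tgt])
```

Because the weights favor source points that look like target points, the sampled source subset's feature distribution shifts toward the target's; the pooled `(X_train, y_train)` can then be fed to any boosting regressor.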
