Transfer Learning with Adaptive Online TrAdaBoost for Data Streams

Ocean Wu, Yun Sing Koh, Gillian Dobbie, Thomas Lacombe
Proceedings of The 13th Asian Conference on Machine Learning, PMLR 157:1017-1032, 2021.

Abstract

In many real-world applications, data are often produced in the form of streams. Consider, for example, data produced by sensors. In data streams there can be concept drift, where the distribution of the data changes. When we deal with multiple streams from the same domain, concepts that have occurred in one stream may occur in another. Therefore, being able to reuse knowledge across multiple streams can help models recover from concept drifts more quickly. A major challenge is that these data streams may be only partially identical, so directly adopting a model from one stream in another would not suffice. In this paper, we propose a novel framework to transfer both identical and partially identical concepts across different streams. In particular, we propose a new technique called Adaptive Online TrAdaBoost that tunes weight adjustments during boosting based on model performance. Experiments on synthetic data verify the desired properties of the proposed method, and experiments on real-world data show that it outperforms its baselines for data stream mining.
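For background, the weight-adjustment mechanism the paper adapts comes from the classic batch TrAdaBoost algorithm (Dai et al., 2007): in each boosting round, misclassified source-domain instances are down-weighted while misclassified target-domain instances are up-weighted AdaBoost-style. The sketch below illustrates that classic update only; it is not the authors' Adaptive Online variant, and the function name and toy inputs are hypothetical.

```python
# Minimal sketch of one round of the classic TrAdaBoost weight update
# (Dai et al., 2007), the basis that Adaptive Online TrAdaBoost builds on.
# All names and inputs here are illustrative, not taken from the paper.
import math

def tradaboost_weight_update(w_src, w_tgt, miss_src, miss_tgt, n_iters):
    """One boosting round.

    w_src, w_tgt   -- current instance weights (source / target domain)
    miss_src/tgt   -- booleans: was each instance misclassified this round?
    n_iters        -- total number of boosting iterations N
    """
    n_src = len(w_src)
    # Fixed down-weighting factor for source instances: beta = 1/(1 + sqrt(2 ln n / N)).
    beta_src = 1.0 / (1.0 + math.sqrt(2.0 * math.log(n_src) / n_iters))
    # Weighted error on the target data (assumed < 0.5 for a weak learner).
    eps = sum(w for w, m in zip(w_tgt, miss_tgt) if m) / sum(w_tgt)
    beta_tgt = eps / (1.0 - eps)
    # Misclassified source instances shrink; misclassified target instances grow.
    new_src = [w * (beta_src if m else 1.0) for w, m in zip(w_src, miss_src)]
    new_tgt = [w / beta_tgt if m else w for w, m in zip(w_tgt, miss_tgt)]
    return new_src, new_tgt
```

The paper's contribution, per the abstract, is to tune these adjustments adaptively from model performance in an online setting rather than using the fixed schedule above.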

Cite this Paper


BibTeX
@InProceedings{pmlr-v157-wu21b,
  title     = {Transfer Learning with Adaptive Online TrAdaBoost for Data Streams},
  author    = {Wu, Ocean and Koh, Yun Sing and Dobbie, Gillian and Lacombe, Thomas},
  booktitle = {Proceedings of The 13th Asian Conference on Machine Learning},
  pages     = {1017--1032},
  year      = {2021},
  editor    = {Balasubramanian, Vineeth N. and Tsang, Ivor},
  volume    = {157},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v157/wu21b/wu21b.pdf},
  url       = {https://proceedings.mlr.press/v157/wu21b.html},
  abstract  = {In many real-world applications, data are often produced in the form of streams. Consider, for example, data produced by sensors. In data streams there can be concept drift where the distribution of the data changes. When we deal with multiple streams from the same domain, concepts that have occurred in one stream may occur in another. Therefore, being able to reuse knowledge across multiple streams can help models recover from concept drifts more quickly. A major challenge is that these data streams may be only partially identical and a direct adoption would not suffice. In this paper, we propose a novel framework to transfer both identical and partially identical concepts across different streams. In particular, we propose a new technique called Adaptive Online TrAdaBoost that tunes weight adjustments during boosting based on model performance. The experiments on synthetic data verify the desired properties of the proposed method, and the experiments on real-world data show the better performance of the method for data stream mining compared with its baselines.}
}
Endnote
%0 Conference Paper
%T Transfer Learning with Adaptive Online TrAdaBoost for Data Streams
%A Ocean Wu
%A Yun Sing Koh
%A Gillian Dobbie
%A Thomas Lacombe
%B Proceedings of The 13th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Vineeth N. Balasubramanian
%E Ivor Tsang
%F pmlr-v157-wu21b
%I PMLR
%P 1017--1032
%U https://proceedings.mlr.press/v157/wu21b.html
%V 157
%X In many real-world applications, data are often produced in the form of streams. Consider, for example, data produced by sensors. In data streams there can be concept drift where the distribution of the data changes. When we deal with multiple streams from the same domain, concepts that have occurred in one stream may occur in another. Therefore, being able to reuse knowledge across multiple streams can help models recover from concept drifts more quickly. A major challenge is that these data streams may be only partially identical and a direct adoption would not suffice. In this paper, we propose a novel framework to transfer both identical and partially identical concepts across different streams. In particular, we propose a new technique called Adaptive Online TrAdaBoost that tunes weight adjustments during boosting based on model performance. The experiments on synthetic data verify the desired properties of the proposed method, and the experiments on real-world data show the better performance of the method for data stream mining compared with its baselines.
APA
Wu, O., Koh, Y.S., Dobbie, G. & Lacombe, T. (2021). Transfer Learning with Adaptive Online TrAdaBoost for Data Streams. Proceedings of The 13th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 157:1017-1032. Available from https://proceedings.mlr.press/v157/wu21b.html.