TY - GEN
T1 - Estimating relatedness via data compression
AU - Juba, Brendan
PY - 2006
Y1 - 2006
N2 - We show that it is possible to use data compression on independently obtained hypotheses from various tasks to algorithmically provide guarantees that the tasks are sufficiently related to benefit from multitask learning. We give uniform bounds in terms of the empirical average error for the true average error of the n hypotheses provided by deterministic learning algorithms drawing independent samples from a set of n unknown computable task distributions over finite sets.
UR - https://www.scopus.com/pages/publications/34250719249
DO - 10.1145/1143844.1143900
M3 - Conference contribution
AN - SCOPUS:34250719249
SN - 1595933832
SN - 9781595933836
T3 - ACM International Conference Proceeding Series
SP - 441
EP - 448
BT - ACM International Conference Proceeding Series - Proceedings of the 23rd International Conference on Machine Learning, ICML 2006
T2 - 23rd International Conference on Machine Learning, ICML 2006
Y2 - 25 June 2006 through 29 June 2006
ER -