Nico Görnitz, Christian Widmer, Georg Zeller, Andre Kahles, Sören Sonnenburg, and Gunnar Rätsch (2011)

# Hierarchical Multitask Structured Output Learning for Large-scale Sequence Segmentation

In: Advances in Neural Information Processing Systems (NIPS'11), NIPS Foundation.

We present a novel regularization-based Multitask Learning (MTL) formulation
for Structured Output (SO) prediction in the case of hierarchical task relations.
Structured output prediction often leads to difficult inference problems and hence
requires large amounts of training data to obtain accurate models. We propose to
use MTL to exploit additional information from related learning tasks by means of
hierarchical regularization. Training SO models on the combined set of examples
from multiple tasks can easily become infeasible for real-world applications. To
be able to solve the optimization problems underlying multitask structured output
learning, we propose an efficient algorithm based on bundle methods. We
demonstrate the performance of our approach in applications from the domain of
computational biology, addressing the key problem of gene finding. We show that
1) our proposed solver achieves much faster convergence than previous methods
and 2) the hierarchical SO-MTL approach outperforms the considered non-MTL
methods.
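To illustrate the core idea of hierarchical regularization, the following is a minimal sketch (not the paper's actual formulation or code): each task in a task tree gets its own weight vector, and a quadratic penalty pulls every node's weights toward those of its parent, so related tasks share information through their common ancestors. The function name, the toy hierarchy, and the parameter `lam` are assumptions made for this example.

```python
import numpy as np

def hierarchical_penalty(weights, parent, lam=1.0):
    """Sketch of a hierarchical MTL regularizer: sum over all non-root
    nodes of lam * ||w_node - w_parent||^2, pulling each task's weights
    toward its parent in the task hierarchy."""
    total = 0.0
    for node, par in parent.items():
        if par is not None:  # the root has no parent and incurs no penalty
            diff = weights[node] - weights[par]
            total += lam * float(diff @ diff)
    return total

# Toy hierarchy: one root node with two leaf tasks.
parent = {"root": None, "task_a": "root", "task_b": "root"}
weights = {
    "root":   np.array([1.0, 0.0]),
    "task_a": np.array([1.0, 1.0]),  # differs from root in one coordinate
    "task_b": np.array([1.0, 0.0]),  # identical to root, zero penalty term
}

print(hierarchical_penalty(weights, parent))  # 1.0: only task_a contributes
```

In a full SO-MTL training objective, a penalty of this form would be added to the sum of the per-task structured losses; the paper's contribution includes solving the resulting optimization efficiently with a bundle-method-based algorithm.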

Accepted.