Pareto Manifold Learning: Tackling multiple tasks via ensembles of single-task models

10/18/2022
by Nikolaos Dimitriadis, et al.

In Multi-Task Learning, tasks may compete and limit one another's performance rather than guiding the optimization trajectory toward a common solution superior to its single-task counterparts. Often no single solution is optimal for all tasks, leading practitioners to balance trade-offs between tasks' performance and to resort to optimality in the Pareto sense. Current Multi-Task Learning methodologies either neglect this aspect of functional diversity entirely, producing a single solution on the Pareto Front predetermined by their optimization scheme, or produce diverse but discrete solutions, each requiring a separate training run. In this paper, we conjecture that there exist Pareto Subspaces, i.e., weight subspaces where multiple optimal functional solutions lie. We propose Pareto Manifold Learning, an ensembling method in weight space that discovers such a parameterization and produces a continuous Pareto Front in a single training run, allowing practitioners to modulate the performance on each task on the fly during inference. We validate the proposed method on a diverse set of multi-task learning benchmarks, ranging from image classification to tabular datasets and scene understanding, and show that Pareto Manifold Learning outperforms state-of-the-art algorithms.
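The core idea of a weight subspace indexed by a task preference can be illustrated with a minimal sketch. This is not the authors' implementation: it uses two hypothetical toy regression tasks, a linear model, and a one-dimensional subspace spanned by two learnable endpoints. Each training step samples a preference alpha, interpolates the endpoint weights, and descends a scalarized loss, so that at inference sweeping alpha traverses a continuous trade-off between the tasks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-task setup (hypothetical data, for illustration only):
# shared inputs, two regression targets with conflicting optimal weights.
X = rng.normal(size=(256, 5))
w_task1 = rng.normal(size=5)
w_task2 = rng.normal(size=5)
y1 = X @ w_task1
y2 = X @ w_task2

# Two learnable endpoints; theta(alpha) interpolates linearly between them.
theta_a = rng.normal(size=5) * 0.01
theta_b = rng.normal(size=5) * 0.01

def task_losses(theta):
    """Mean-squared error on each task for a point in weight space."""
    p = X @ theta
    return np.mean((p - y1) ** 2), np.mean((p - y2) ** 2)

lr = 0.05
for step in range(2000):
    # Sample a preference; alpha weighs task 2 against task 1.
    alpha = rng.uniform()
    theta = (1 - alpha) * theta_a + alpha * theta_b

    # Gradient of the scalarized loss (closed form for a linear model).
    g1 = 2 * X.T @ (X @ theta - y1) / len(X)
    g2 = 2 * X.T @ (X @ theta - y2) / len(X)
    g = (1 - alpha) * g1 + alpha * g2

    # Chain rule through the interpolation updates both endpoints.
    theta_a -= lr * (1 - alpha) * g
    theta_b -= lr * alpha * g

# At inference, sweeping alpha traverses a continuous trade-off curve
# without any retraining.
for alpha in (0.0, 0.5, 1.0):
    theta = (1 - alpha) * theta_a + alpha * theta_b
    l1, l2 = task_losses(theta)
    print(f"alpha={alpha:.1f}  task1 MSE={l1:.3f}  task2 MSE={l2:.3f}")
```

In this quadratic toy problem the two endpoints specialize toward the two single-task optima, so the learned segment coincides with the true Pareto set; the paper's contribution is making this kind of subspace training work for deep networks and more than two tasks.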


Related research

- Pareto Multi-Task Learning (12/30/2019): Multi-task learning is a powerful method for solving multiple correlated...
- Efficient Continuous Pareto Exploration in Multi-Task Learning (06/29/2020): Tasks in multi-task learning often correlate, conflict, or even compete ...
- Multi-Task Learning as Multi-Objective Optimization (10/10/2018): In multi-task learning, multiple tasks are solved jointly, sharing induc...
- Exact Pareto Optimal Search for Multi-Task Learning: Touring the Pareto Front (08/02/2021): Multi-Task Learning (MTL) is a well-established paradigm for training de...
- Multi-Task Multicriteria Hyperparameter Optimization (02/15/2020): We present a new method for searching optimal hyperparameters among seve...
- Revisiting Scalarization in Multi-Task Learning: A Theoretical Perspective (08/27/2023): Linear scalarization, i.e., combining all loss functions by a weighted s...
- Finding Pareto Efficient Redistricting Plans with Short Bursts (04/02/2023): Redistricting practitioners must balance many competing constraints and ...
